<<tiddler [[Risk Replication Manager]]>>
Acorn Firewall is the name of the firewall at OakvaleCapital that the DataScopeRateFeed was behind.
The firewall needs to permit the DataScopeRateFeed to use FTP to access [[Reuters Datascope Select]]. If you put the DataScopeRateFeed into the DMZ, you need to permit RPC requests to it for the RateFeedDashboard. (The port is defined in the Clone section of the [[Configuration Files]].)
I suggest the DataScopeRateFeed be in the DMZ part of the firewall. At OakvaleCapital, we had it in the "inside" region to keep all the RateFeed applications in a single session for ease of management.
!! References
Defn: the firewall that DataScopeRateFeed sits behind
~ArchiveRateFeedDownloads.py is a quick Python script to keep the downloads folder (of DataScopeRateFeed) clean. I have not included it as it is a bit specific to OakvaleCapital, but [[Contact Us]] if you wish for a copy.
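For reference, a minimal sketch of what such an archive script could look like (the folder path and 31-day cut-off are illustrative, not the OakvaleCapital values):
{{{
import os, time, zipfile

DOWNLOADS = r"k:\treasury\rate feed\Downloads"   # illustrative path
MAX_AGE_DAYS = 31                                # illustrative retention period

def archive_old_downloads():
    cutoff = time.time() - MAX_AGE_DAYS * 86400
    archive = os.path.join(DOWNLOADS, time.strftime("Archive %Y-%m.zip"))
    with zipfile.ZipFile(archive, "a", zipfile.ZIP_DEFLATED) as zf:
        for name in os.listdir(DOWNLOADS):
            path = os.path.join(DOWNLOADS, name)
            # only archive plain files that have not been touched recently
            if name.endswith(".zip") or not os.path.isfile(path):
                continue
            if os.path.getmtime(path) < cutoff:
                zf.write(path, name)
                os.remove(path)

if __name__ == "__main__":
    archive_old_downloads()
}}}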
!! References
* DataScopeRateFeed
Defn: a management script
AvantGard is the SunGard product category that includes [[Quantum]] and [[Risk]]
!! References
* SunGard
Defn: a product category
Bazaar is a source code management system.
At OakvaleCapital, we used Bazaar to manage the [[Python]] script that DataScopeRateFeed was implemented in, and the [[Configuration Files]] that defined what the application did. However, any of the similar products would do an equally good job.
!! References
* http://bazaar.canonical.com/en/
Defn: a source code management system.
Blended Yield Curves are Yield Curves that contain both the Cash Curve and the Swaps Curve.
!! References
Defn: a Yield Curve that contains both the Cash Curve and the Swaps Curve.
Bloomberg is a source of [[Rates]] and a competitor to [[Reuters]]. [[FRM]] used [[Bloomberg]] for rates validation and, presumably, to source some hard to find rates.
In theory, only one module of the DataScopeRateFeed would need to be modified to support Bloomberg as the principal rate source.
!! References
Defn: a source of [[Rates]] (not directly used by DataScopeRateFeed)
Bonds are a form of issued debt that can be traded. A bond is priced using a market coupon rate (provided by [[Reuters Datascope Select]]), or at an offset from a better known market coupon rate.
!! References
Defn: A financial instrument that is priced off of a coupon rate
Coal forward curves in OakvaleCapital are built so that there is no interpolation of prices between points. That is, if the June 2012 point is 116.25 and the December 2012 point is 115.75, then a November 2012 contract is priced at 116.25. (This is called "forward stepped" in the language of the curve definition section of the [[Configuration Files]]. If the curve were "backward stepped", the November 2012 contract would be priced at the 115.75 rate in the example above.)
This article describes how to manually convert the curve, followed by a couple of notes about what the RateFeed actually does. A [[Python]] sketch of the manual conversion follows the steps.
!! Manual Process
# Collect a strip of prices (in the case of coal, these are a "Coal Price Index") and the expiry date of each
# Determine the "Rate Date" required and ignore any prices that have expired by that date
# The spot price is the first expiring price of the strip
# The spot date is the "Rate Date" plus four days if the "Rate Date" is a Thursday or Friday, otherwise plus two days
# Convert each price into forward points by subtracting the "spot price"
# Convert each expiry date into a interval of days by subtracting the "spot date"
#* In excel, this is {{{="" & INT(E7 - $F$1) & "DAYS"}}}
#* Thus a rate date of 29-feb-2012 has a spot date of 2-mar-2012. The price expiring on 30-mar-2012 has an interval of "28 DAYS".
# Each time the forward point value changes, insert an extra interval five days before the next interval, carrying the same forward point as the earlier interval
#* Thus if there is an interval of "28 DAYS" with a forward point of 0 and one of "57 DAYS" with a forward point of 1.1, then add an interval of "52 DAYS" with a forward point of 0
#* Five days was worked out empirically a long time ago, and is probably the Easter break + 1
# Add a "30 YEAR" interval with a forward point of the last forward point of the strip.
#* This fixes extrapolation differences between [[Quantum]] and [[Risk]].
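The sketch below walks through the same steps in [[Python]] (illustrative only; it assumes the strip is already a sorted list of {{{(expiry_date, price)}}} pairs and does not handle the month/quarter/year boundary convention described in the notes below):
{{{
from datetime import timedelta

def spot_date(rate_date):
    # "Rate Date" plus four days on a Thursday/Friday, otherwise plus two
    return rate_date + timedelta(days=4 if rate_date.weekday() in (3, 4) else 2)

def forward_stepped_intervals(strip, rate_date):
    """strip: list of (expiry_date, price) pairs sorted by expiry date."""
    live = [(d, p) for d, p in strip if d >= rate_date]      # drop expired prices
    spot_px, spot_dt = live[0][1], spot_date(rate_date)
    points = [((d - spot_dt).days, p - spot_px) for d, p in live]
    out, prev = [], None
    for days, fwd in points:
        if prev is not None and fwd != prev[1]:
            # forward point changed: add an interval five days earlier,
            # carrying the previous forward point
            out.append(("%d DAYS" % (days - 5), prev[1]))
        out.append(("%d DAYS" % days, fwd))
        prev = (days, fwd)
    out.append(("30 YEARS", points[-1][1]))                   # flat extrapolation point
    return out
}}}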
!! Notes from DataScope RateFeed
The DataScopeRateFeed gets strips of rates for Coal that include months, quarters and years. By convention, we do not want to extrapolate rates between the boundaries of months to quarters and quarters to years. Thus if the monthly rates do not complete a quarter, we use the quarterly rate. Likewise, if the quarterly rates do not complete a (calendar) year, we use the annual rates.
!! References
* [[Configuration Files]]
The Cash Curve is the component of the [[Yield Curve]] that is quoted in [[Reuters Datascope Select]] on a cash basis.
!! References
Defn: a component of the [[Yield Curve]] that is quoted in [[Reuters Datascope Select]] on a cash basis.
! Cash Curve
<<tiddler [[Cash Curve]]>>
! Swaps Curve
<<tiddler [[Swaps Curve]]>>
* [[Yield Curve]]
** [[Blended Yield Curve]]
The tiddlers on the left, and their tags, are site categories
I was asked to compare what [[RateFeed]] posted vs what was in [[Quantum]] now. The process was simple, given the [[curves database]] was designed to make comparison easy. The following SQL shows how to find [[End of Day]] Rates in [[QTProduction]] (via ''rate_type'') and the [[curves database]] (via the ''eodCurves'' table).
; QTProduction
{{{
select name, rate_dt, timeband, bid, offer
from intrates
where name like 'Yield Curve%' and ccy = 'AUD'
and (rate_dt = '2012-06-29' or rate_dt = '2012-06-30')
and ratetype = 'End of Day'
order by rate_dt, name, datedays
}}}
; Curves database
{{{
select name, rate_dt, timeband, bid, offer
from intrates
join eodCurves on eodCurves.histCurveId = intrates.histCurveId
where name like 'Yield Curve%' and ccy = 'AUD'
and (rate_dt = '2012-06-29 00:00:00' or rate_dt = '2012-06-30 00:00:00')
order by rate_dt, name, datedays
}}}
@@background-color:#AAF;''Note'': The rates in [[Quantum]] are rounded as per the currency record. The [[RateFeed]] does no such rounding. I needed to use the Excel round function to eliminate these differences.@@
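The rounding can also be reproduced outside Excel. A minimal sketch in [[Python]], assuming both queries have been read into lists of {{{(name, rate_dt, timeband, bid, offer)}}} tuples and that four decimal places matches the currency record (both assumptions are illustrative):
{{{
def compare(quantum_rows, curves_rows, places=4):
    # key each row by (name, rate_dt, timeband) and compare rounded bid/offer
    def keyed(rows):
        return {(n, str(d)[:10], tb): (round(b, places), round(o, places))
                for n, d, tb, b, o in rows}
    q, c = keyed(quantum_rows), keyed(curves_rows)
    return [(k, q[k], c[k]) for k in q if k in c and q[k] != c[k]]
}}}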
!! References
* [[curves database]]
Articles about configuration
The DataScopeRateFeed expects to find its configuration files in {{{k:\treasury\rate feed\configuration}}}, or in the folder given as a parameter to the application. If they are not there, the application will not start.
The DataScopeRateFeed will read every file with the extension of ".ini" in that folder and parse it. Users may split the configuration into as many files as they want.
@@background-color:#AAF;''Note'': After the DataScopeRateFeed starts, it shows the "last update" time of the most recently edited configuration file, and the number of configuration files parsed.@@
!! Usage
The current configuration files are on {{{\\oakrmsmscs.oakrms.local\iRMS\Treasury\Rate Feed\Configuration}}}. They have been committed to the [[Bazaar]] repository. They are used as follows:
; Current configuration files
|! File name |! Usage |! Type |
|Site.ini |contains the [Mode], [Clone] and [Paths] sections. These sections are the only ones that need to be different between a clone and the live DataScopeRateFeed. Thus, as changes are committed on production, every file except this can be ([[Bazaar]]) updated in the clone folders. |Site specific |
|Formulae.ini |contains the [[RateFeed Formula]] functions used through the curve definitions |General |
|General.ini |contains the remainder of the non-curve configuration. In particular, it has the schedules and database parameters |General |
|FX Curves.ini |contains the definitions of all FX curves and Precious Metal spots |Curves |
|Commodities.ini |contains the definition of Base Metal forwards and volatilities, Iron Ore, Gas Oil and ~WTI-Nymex Oil (ECW) forwards |Curves |
|Coal.ini |contains the Coal stepped curve, and a formula for correctly selecting maturity date |Curves |
|Fixings.ini |contains the fixing rates used by [[Operations]] to rate set deals |Curves |
|Yield Curves.ini |contains the [[Blended Yield Curve]] definitions. Note that many yield curves include curves from "Fixings.ini" as the [[Cash Curve]] component |Curves |
|Margins.ini |contains the margins (from the [[Manual RateFeed]]) that are used with the investment curves below |Curves |
|~ReferenceRates.ini |contains the investment curves (Bank Yield Curves and Commercial Yield Curves) |Curves |
|Bonds.ini |contains [[Bonds]] and [[Income Securities|Income Security]] |Curves |
|Volatilities.ini |contains the volatilities of the Interest rate Options, Currency Options and Swaptions. The Base Metal volatilities are in "Commodities.ini" |Curves |
!! Format
The files are in the ini format of:
{{{
[section]
key1=value
key2=value
;key3=value
...
}}}
Further,
* each section must be unique across all configuration files.
* capitalisation is irrelevant for the keys
* spaces between the key and value are ignored
* Any line starting with a semi-colon is considered to be a comment and ignored.
Note the following common forms:
|! short form |! meaning |
|on |Overnight |
|sp |Spot |
|sw |Spot Week |
|''n''d |''n'' DAYS (e.g. 215d ⇒ 215 DAYS) |
|1w |1 Week |
|''n''m |''n'' MONTHS (e.g. 1m ⇒ 1 MONTH, 9m ⇒ 9 MONTHS) |
|''n''q |''n'' QUARTERS (unused) |
|''n''y |''n'' YEARS (e.g. 1y ⇒ 1 YEAR, 15y ⇒ 15 YEARS) |
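A minimal sketch of expanding these short forms (illustrative; not the DataScopeRateFeed's own parser):
{{{
import re

FIXED = {"on": "Overnight", "sp": "Spot", "sw": "Spot Week", "1w": "1 Week"}
UNITS = {"d": "DAY", "m": "MONTH", "q": "QUARTER", "y": "YEAR"}

def expand_timeband(code):
    code = code.lower()
    if code in FIXED:
        return FIXED[code]
    m = re.fullmatch(r"(\d+)([dmqy])", code)
    if not m:
        raise ValueError("unknown timeband code: %s" % code)
    n, unit = int(m.group(1)), UNITS[m.group(2)]
    return "%d %s%s" % (n, unit, "" if n == 1 else "S")   # e.g. 9m -> 9 MONTHS
}}}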
!!! General
The ''General'' section contains the top-level configuration.
{{{
[General]
; Log level sets the level of detail that is logged.
; 0: Errors only
; 1: Terse logging
; 3: Detailed logging
; 5: Verbose logging
logLevel=2
; The number of days of no updates before a RIC becomes stale (and removed from current rates)
maxRicAge = 31
; Time at which every curve is rebuilt. This clears the "Configuration" health bar and impacts
; every curve with an effective date of "Today" or "Yesterday"
morningBuild=3:00
; The warning and error limits is the number of sequential failed attempts to open a database before reporting a warning/error
; Note this can be overridden at the database level
warningLimit=3
errorLimit=20
}}}
|! Parameter |! Meaning |
|logLevel |a number from zero (0) to five (5):<br>..: ''0'': Errors only<br>..: ''1'': Terse logging<br>..: ''3'': Detailed logging<br>..: ''5'': Verbose logging |
|maxRicAge |The number of days after the last update that a RIC (on the pool database) will be considered stale and removed. For example, if a RIC is removed from the [[Manual RateFeed]], its last rate will remain available for formulae until this period has passed. Only then will any formula using it report an error.<br><br>The value here should be at least a month as the manual rates are updated at least monthly. It is hard to say what the upper value should be, but perhaps 50 days gives a good balance. |
|morningBuild |The time for the morning build of all rates. This is used to clear any "Configuration" health warnings from yesterday and to freshen every curve. As a consequence, all curves with an effective date of "Today" or "Yesterday" will be updated at this time, and the QuantumRateFeed will apply the updates. |
|warningLimit |This is the default number of failed attempts to open a resource, such as a database before either a warning message or an error message is written to the log. Further, the errorLimit indicates how many sequential failures are required to get a "dead" health bar.<br><br>These limits can be overridden in other sections. |
|errorLimit |~|
!!! Mode
{{{
[Mode]
; Mode needs to be one of Production,Clone and Test
; Production: Read from DataScope and Manual RateFeed
; Clone: Copy new Pool data rows from production to our pool data
; Test: Read RIC data from folder and Manual RateFeed
mode=Live
}}}
|! Parameter |! Meaning |
|mode |The [[application mode|DataScopeRateFeed]]. One of:<br>* Live<br>* Test<br>* Clone |
!!! Curve Defaults
There may be a default section for each ''curveType'' as defined in Curve definition below. Any definition set in the curve default for a type applies to every curve of that type unless overridden in the curve itself.
{{{
[Yield Defaults]
RateName=Yield Curve
shortHeading=%ccy%
QTOM Fields=Protocol,Input Type,Days Convention, Days Offset
Days Convention=USE CURRENCY
Days Offset=0
Input Type=Yield
Protocol=ActiveX
}}}
!!! Curve definition
Most of the sections in the configuration files are curve definitions. A section is a curve definition if it has a parameter "curveType".
{{{
[USD/INR Foreign exchange]
curveType=FX
ccy=INR
EOD=PM
EffectiveDate=Today
tb_sp= pool(Bid Price) || pool(Ask Price) || INR= || days: spotDays()
;Interpolate the Spotweek while Justin ponders
;tb_sw= pool(Bid Price) / 100 || pool(Ask Price) / 100 || INRSW= || days: 7
tb_sw= interpolate() || interpolate() || INRSW= || days: 7
tb_1m= pool(Bid Price) / 100 || pool(Ask Price) / 100 || INR1M= || days: spotDays(months=1)
tb_2m= ditto() || ditto() || INR2M= || days: spotDays(months=2)
tb_3m= ditto() || ditto() || INR3M= || days: spotDays(months=3)
tb_6m= ditto() || ditto() || INR6M= || days: spotDays(months=6)
tb_9m= ditto() || ditto() || INR9M= || days: spotDays(months=9)
tb_1y= ditto() || ditto() || INR1Y= || days: spotDays(years=1)
snaps=FX Forwards,Spots 1,Spots 2
}}}
|! Parameter |! Meaning |
|''section name'' |The name of the curve, as used in the curve data, and all formulae. |
|curveType |One of "FX", "Yield", "Volatility"<br><br>There are sections called "FX Defaults", "Yield Defaults" and "Volatility Defaults" that contain default values to use for any of the parameters below. They are applied according to this parameter. |
|ccy |The currency of the curve |
|ccy2 |''Optional:'' FX Option Volatilities pertain to a currency pair. "ccy2" is used to indicate the second currency. No other curves use this parameter. |
|EOD |The EOD definition that describes when the curve is to be flagged as [[End of Day]]. |
|shortHeading |A short form heading (may not be used any more) |
|~EffectiveDate |A [[RateFeed Formula]] that calculates the effective date.<br><br>It may be "Today" or "Yesterday" and have that effect (according to local time). |
|permitZeros |''Optional:'' By default false, if given the value "True" or 1, curves with zeros in bid or offer are accepted. Otherwise they are rejected. See also ''~MinimumValue'' below. |
|permitNegatives |''Optional:'' By default false, if given the value "True" or 1, curves with negative rates in bid or offer are accepted. Otherwise they are rejected. See also ''~MinimumValue'' below. |
|~MinimumValue |''Optional:'' A rate (as a floating point number, eg 0.0001) <br><br>As an alternative to ''~PermitZeros/~PermitNegatives'' (described above), you may set a minimum value for every bid and offer in the curve. The substitution is made without logging. (If not present, the minimum value is minus max float). It is designed specifically to avoid calculated yield curves going negative or being rejected.<br>@@background-color:#AAF;''Note'': If you want to apply a minimum value to some points but not others, put the formulae needing a minimum into a<br>{{{max((0.0001,....))}}}<br>where 0.0001 is the desired minimum value and ... is the formula.@@ |
|continueOnError |''Optional:'' By default false, if given the value "True" or 1, curves with error timebands are used, just without those timebands. Otherwise they are rejected. This feature is particularly intended for Bonds. |
|snaps |''Optional:'' A comma separated list of snap definitions that indicates when this curve is required.<br><br>DataScopeRateFeed uses the [[RIC]]s and fields defined in this curve, along with this parameter, to build a list of expected instruments in each snap. This list can be used to import an instrument list into [[DataScope]] (much more quickly than can be done by hand).<br><br>@@background-color:#AAF;''Note'': Irrespective of this setting, this curve will be rebuilt whenever any of the [[RIC]]s it depends on are updated.@@ |
|tb_ __timeband code__ |Defines a timeband for the curve. The code must be a valid code (as shown above) or a pair of valid codes in the case of Swaptions. Note that the actual time band used for the curve can be overridden by adding a [[RateFeed Formula]].<br><br>The value is a set of [[RateFeed Formula]]e separated by double bars ({{{||}}}) much like tables in this wiki. Each formula is preceded by a field name and a colon. However, the first three formulae are assumed to be the bid, offer and ric, unless otherwise specified.<br><br>Consider the following:<br><br>''Use the "Last Price" from "~AU1MBA=" for the bid and offer''<br>{{{tb_1m = pool(Last Price) || pool(Last Price) || AU1MBA= || days: 31 }}}<br><br>''There is no RIC for this timeband, so interpolate from those that do have one''<br>{{{tb_4m = interpolate() || ditto() || basis: 'Cash' || days: 122 }}}<br><br>''Use the "primary activity" of "GBPONFSR=" for the bid and set the offer to the bid''<br>{{{tb_on = pool(Primary Activity) || curve(,,Bid) || GBPONFSR= || basis: 'Cash' || days: 1}}}<br><br>''Take the rates from the [[Manual RateFeed]], but overwrite the timeband''<br>{{{tb_5m = pool(Bid Price) || curve(,,Bid) || MRF!WATC 5M || basis: 'Quarterly' || date: pool(maturity date) || timeband: 'ACTUAL'}}}<br><br>''Use a more complex formula for the bid, use copy down for the offer and calculate the timeband''<br>{{{tb_1m= pool(Previous Close Price)-pool(Previous Close Price,CLc1) || ditto() || CLc1 || days: (pool(Expiration Date) - curve(,,rate_dt)).days || timeband: '%d DAYS' % curve(,,fwd_days)}}}<br><br>@@''Warning'': The ric in this value is not actually a formula; it must be a string of characters that pertains to a [[DataScope]] or [[Manual RateFeed]] [[RIC]]@@ |
|security__suffix__ |Defines a security for this bond collection. <br><br>The value is a set of [[RateFeed Formula]]e separated by double bars ({{{||}}}) much like tables in this wiki. Each formula is preceded by a field name and a colon. However, the first three formulae are assumed to be the bid, security and ric, unless otherwise specified. <br><br>Consider the following:<br>''Use the mid yield from the given RIC''<br>{{{security42= (pool(Bid Yield) + pool(Ask Yield))/2 || Inv Bond WBCS0312 || AU3CB0107241=R}}}<br><br>''Calculate the yield from another bond''<br>{{{security81= bond(Inv Bond CGOA0415) + 0.70 || Inv Bond APAW1215}}} |
|QTOM Fields |Defines fields that are needed to insert the given curve into [[QTOM]], but are not otherwise relevant for the curve building. <br><br>The fields are presented in a comma separated list and are capitalised as Quantum wishes them to be. For each field in this list, there must be a definition with the same name.<br><br>In the example below (for AUD Bonds), as the ''QTOM Fields'' is set to "Input Type", there must be an ''Input Type'' definition.<br>{{{QTOM Fields=Input Type}}}<br>{{{Input Type=Yield}}}<br><br>The QTOM Fields and their values are simply put into the "qtomFields" table of the [[curves database]] for use by the QuantumRateFeed. |
|//__QTOM Fields__// |~|
|writeToQuantum |One of "Yes", "No", "Live", or "End of Day"<br><br>If not present, "Yes" is the default. If either of "Live" or "End of Day", only that rate type is written to Quantum. |
|stepped |Only set if this is a stepped curve. One of "Forward" or "Backward". See [[How to build a stepped coal curve|Build a stepped coal curve]].<br>[img[http://farm3.staticflickr.com/2860/12603297933_63f817352b_b.jpg][http://www.flickr.com/photos/64724523@N03/12603297933/]] |
|requiredRICs |These settings are optional, and are only available to curves that have been assigned to a snap.<br><br>Sometimes the formulae used to calculate the curve are too complex to be parsed by DataScopeRateFeed to determine which [[RIC]]s, Curves and Fields are needed to build the current curve. In such a case, you may flag the dependency here. This means that when any of those ~RICs or Curves are updated, this curve will be rebuilt.<br>{{{; We may want to use these ~RICs in future, so we will capture them for a time}}}<br>{{{RequiredRics=NCFMc2,NCFMc3,NCFMc4,NCFMc5,NCFMc6}}}<br><br>@@background-color:#AAF;''Note'': The ''requiredFields'' setting is for future use.@@ |
|requiredCurves |~|
|requiredFields |~|
!!! Snap definitions
Any section with a "snapTiming" parameter is a snap definition.
{{{
[Comm Int Rates]
dataSources=DataScope
snapTiming=10:00
days=Tue-Sat
[FX Forwards]
dataSources=DataScope
snapTiming=10:00,16:20
RICpattern=(?![A-Z]{3}=)
}}}
|! Parameter |! Meaning |
|dataSources |A comma separated list of data sources that can be snapped. At present "DataScope" is the only valid value. |
|snapTiming |A comma separated list of times when the snaps are available on DataScope. At least one time must be provided. |
|days |''Optional:'' The days that the snap is available. Days are in terms of three letter codes in ranges, or comma separated. By default, this is "~Sun-Sat". |
|~RICpattern |''Optional:'' A regular expression used to restrict the ~RICs that belong to the snap.<br><br>Generally leave these to MauriceManeschi to set. However, the following is a valid pattern:<br>{{{RICpattern=(EUR=|EURSW=|EUR1M=|AUD=|NZD=)}}} |
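A minimal sketch of interpreting the ''days'' ranges above (illustrative only; assumes three-letter day codes, ranges with a hyphen and comma-separated lists):
{{{
DAYS = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]

def expand_days(spec="Sun-Sat"):
    """Return the set of weekday numbers (Mon=0) that a 'days' value covers."""
    result = set()
    for part in spec.split(","):
        part = part.strip()
        if "-" in part:
            start, end = (DAYS.index(d) for d in part.split("-"))
            i = start
            while True:
                result.add(i)
                if i == end:
                    break
                i = (i + 1) % 7          # ranges may wrap, e.g. Sat-Mon
        else:
            result.add(DAYS.index(part))
    return result
}}}
For example, "Tue-Sat" maps to weekdays 1 through 5 (Tuesday to Saturday, with Monday as 0).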
!!! Paths
These are the locations of resources that do not require other parameters.
{{{
[Paths]
LogFile=k:\treasury\rate feed\Log Files\DataScope RateFeed.log
Downloads=k:\treasury\rate feed\Downloads
Pool data=k:\treasury\rate feed\pool.sqlite
Curve data=k:\treasury\rate feed\curves.sqlite
ManualRateFeed=k:\treasury\rate feed\Manual\Manual Input.xlsx
Test Mode Input=k:\treasury\rate feed\Test Mode Input
Test Mode Processed=k:\treasury\rate feed\Test Mode Processed
}}}
|! Parameter |! Meaning |
|~LogFile |Path of the log file |
|Downloads |The folder to write all files downloaded from [[Reuters Datascope Select]] |
|Pool data |Path of the [[pool database]] |
|Curve data |Path of the [[curves database]] |
|~ManualRateFeed |Path of the [[Manual RateFeed]] |
|Test Mode Input |Path to find DataScope files when running in test mode |
|Test Mode Processed |Path where test mode files are placed after being processed |
!!! SQL database
There are currently two [[SQL]] database sections in the configuration: "Pool data" and "Curve data"
{{{
[Pool data]
databaseType=sqlite
;username=unused
;password=
; Number of days to purge rates after
purgeAfter=90
}}}
|! Parameter |! Meaning |
|databaseType |Currently only "sqlite" is supported. Eventually we may allow "sqlserver" and others |
|username |Database user name and password, not required for "sqlite" |
|password |~|
|purgeAfter |The tables in these databases have all their rates date stamped. Rates older than this number of days are permanently deleted. |
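A minimal sketch of what ''purgeAfter'' implies, assuming an SQLite table with a {{{rate_dt}}} column (the table and column names are illustrative):
{{{
import sqlite3

def purge_old_rates(db_path, table, purge_after_days):
    """Delete rows whose rate date falls outside the retention window."""
    con = sqlite3.connect(db_path)
    with con:   # commits on success
        # table name and day count are formatted in for brevity of the sketch
        con.execute("DELETE FROM %s WHERE rate_dt < date('now', '-%d days')"
                    % (table, purge_after_days))
    con.close()
}}}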
!!! EOD Timings
These sections determine when to mark the most current curve as the EOD curve. The section labels here correspond to the ''EOD'' parameter of the //Curve Description// above.
{{{
[Morning]
timing=AM
time=10:25
[Afternoon]
timing=PM
time=16:40
[Bonds EOD]
timing=Bonds EOD
time=17:00
}}}
|! Parameter |! Meaning |
|timing |Any section with a ''timing'' parameter is presumed to be an ''EOD Timing'' section. The value is referred to in curve definitions above under the parameter "EOD". |
|time |<br> __hour__:__minute__<br>The time when the current curves (assigned to this end of day) are to be marked as [[End of Day]]. |
!!! DataScope
This section defines how we connect to DataScope.
{{{
[DataScope]
dataSourcePrefix=
type=ftp
server=hosted.datascope.reuters.com
username=r157957
password=************
version=5.2.4.29882 (29882)
ignore files older than=7
}}}
|! Parameter |! Meaning |
|dataSourcePrefix |A string of characters that prefixes the "rics" for this data source. In this case, it should always be blank. |
|type |The type of data source. In this case, it should always be "ftp" |
|username |The [[FTP]] user name (as provided by [[Reuters]]).<br><br>@@''Warning'': The DataScope FTP User Guide warns that the [[FTP]] user name differs from the web user name by the letter "r" at the front.@@ |
|password |The [[FTP]] password (as provided by [[Reuters]]). |
|version |The version string from the start of the ".notes.txt" files that are created by [[DataScope]]. This string is used to flag a warning if the actual version downloaded is different. |
|ignore files older than |Any files found on [[DataScope]] that are older than this many days are ignored.<br>This parameter only has an impact if the pool data is cleared, probably in the event of a software upgrade or an aggressive bug fix. |
!!! ~ManualRateFeed
[>img[http://farm8.staticflickr.com/7446/12603483055_0433384b6b.jpg][http://www.flickr.com/photos/64724523@N03/12603483055/]]
{{{
[ManualRateFeed]
dataSourcePrefix=MRF
type=excel
RicSheets=Instruments
}}}
|! Parameter |! Meaning |
|dataSourcePrefix |A string of characters that prefixes the "rics" for this data source. In this case, it should always be "MRF!". |
|type |The type of data source. In this case, it should always be "excel" |
|~RicSheets |A comma separated list of sheets from which the rates will be read.<br><br>The first row of each sheet (until there is a blank) contains the names of the fields. Every subsequent row (until there is a gap) has the actual rates. |
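A minimal sketch of reading such a sheet, assuming the openpyxl package (the DataScopeRateFeed itself may read Excel differently):
{{{
from openpyxl import load_workbook

def read_ric_sheet(path, sheet_name):
    """Read field names from row 1 (until a blank) and rates from the rows below."""
    ws = load_workbook(path, data_only=True)[sheet_name]
    fields = []
    for cell in ws[1]:
        if cell.value is None:
            break
        fields.append(cell.value)
    rows = []
    for row in ws.iter_rows(min_row=2, max_col=len(fields), values_only=True):
        if row[0] is None:            # stop at the first gap
            break
        rows.append(dict(zip(fields, row)))
    return rows
}}}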
!!! Formulae
Each effective date and the fields of a timeband/security element of a [[curve definition|#Curve definitions]] are all [[RateFeed Formula]]e. Some of these formulae are too complex for one line, so need to be moved into a function (that is, a label that takes a series of arguments and returns a value). These functions can be defined in the configuration files.
{{{
[curve]
formulaType=builtin
description=Return the rate for the given field of the given timeband of the given curve
argument1=curve
argument2=timeband
argument3=field
[chain]
formulaType=explicit
description=Returns the non zero pool value of the first working field in the list
argument1=fields
line1 =|fields = fields.split(',')
line2 =|for field in fields:
line3 =| try:
line4 =| value = pool(field)
line5 =| if value: return value
line6 =| except:
line7 =| continue
line8 =|return 0.00
}}}
There are two types of function: builtin and explicit. The builtin functions are key components of the DataScopeRateFeed so do not need declaration here at all, but I declare them anyway to provide a simple reference. Explicit functions contain all the code as [[RateFeed Formula]]e.
|! Parameter |! Meaning |
|''section name'' |Name of the function that will be available later |
|argument''n'' |The arguments of the function such that:<br> __sectionName__(__argument1__, __argument2__, ...) <br>is how the function can be used in other [[RateFeed Formula]]e |
|formulaType |One of "builtin" or "explicit" |
|module |''Optional'': Only if the formulaType is "builtin". If provided, the function is a [[plugin|RateFeed Formula]]. This means that the module name is imported into the DataScopeRateFeed application and its "run" method called with any parameters provided. |
|description__//suffix//__ |Text explaining what the formula does |
|line''order'' |''Optional'': Only if the formulaType is "explicit". A line of [[RateFeed Formula]] following a "{{{|}}}".<br><br>''order'' is a floating point number (e.g. 1, 2.5, 8.0001). The lines are ordered by the suffix (in number order)<br><br>@@background-color:#AAF;''Note'': The code in these lines is compiled as part of the configuration process of the DataScopeRateFeed. If there are syntax errors, the application will stop and wait until you fix them. If there are coding errors, you will notice that the "Configuration" health bar is set to sick, or the "Software" bar is set to sick if it collapses badly.@@<br><br>@@''Warning'': There is no debugging facility built into the DataScopeRateFeed. Test your function fully in [[Python]] on a [[Test Server]] before using it in production.@@ |
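A minimal sketch of how the {{{line}}} keys could be assembled into source order (illustrative only; the real configuration reader may differ):
{{{
def assemble_formula_lines(section):
    """section: dict of lower-cased keys to values, e.g. {'line2.5': '| months = ...'}"""
    lines = [(float(key[4:]), value) for key, value in section.items()
             if key.startswith("line")]
    # order by the numeric suffix (1, 2, 2.5, 8.0001, ...) and drop the leading "|"
    return [value[1:] if value.startswith("|") else value
            for _, value in sorted(lines)]
}}}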
!!! Clone
The ''Clone'' section defines how this DataScopeRateFeed would connect to another DataScopeRateFeed instance when it is running in ''Clone mode'', and which port this DataScopeRateFeed should listen on for other clones. This port is also used for reporting.
{{{
[Clone]
; The production pool data file is only used if mode = Clone
server=oakrmsctx02.rms.local
port=8333
; How many minutes between checks of the server
frequency=1
}}}
|! Parameter |! Meaning |
|server |The server to clone from. Generally, this should be your production server |
|port |The TCP port to clone over, generally 8333. There may be two ports here (comma separated), in which case it means read from the first port and respond on the second. (This is only needed if you want to have two [[DataScopeRateFeed]]s running on the same server.) |
|frequency |How often (in minutes) the clone server should check for new rates. |
!! Design
@@''Warning'': The configuration files are not implemented as the DataScopeRateFeed designer (MauriceManeschi) would have liked. They are overly repetitive, long and contain redundant information. The format documented here represents a negotiated compromise between requirements for easy to read tables and clean design. However, with careful management, it is sufficient for the job.
During the design stage, there was a plan to provide an editor for the configuration files to achieve the "easy to read" requirement.@@
The configuration file uses the ''ini'' file format as it is well supported by [[Python]] (via ''~ConfigParser''):
* The ini format requires the keys to be unique, hence the suffixing behaviour for keys (e.g. security45=)
* The configuration process does ensure that each section is only defined once
* All keys are converted to lower case. Sections are not. So references to keys generally go through a lower-casing function.
* The defaulting function of ''~ConfigParser'' is too generic to be useful, so it is not used. shortHeadings are allowed to have "%ccy%" converted to the currency, but nothing else is supported at present.
* Curve definitions will complain if they get extra parameters (as this likely means the user has put in a typo, and it may not be immediately obvious what they have done). All other sections ignore redundant parameters
!! Business Decisions
* The old [[RateFeed]] used the fields bid and ask for several instruments when calculating the curves bid and offer. Those fields were not available in [[Reuters Datascope Select]]. After some analysis, MauriceManeschi decided to use "Universal Bid Price" and "Universal Ask Price" in their stead.
* The old [[RateFeed]] hard coded the scaling factor for the [[FX]] forward points in Excel. [[Reuters Datascope Select]] provides a "Cross Scaling Factor" which contains the appropriate number to divide by. @@''Warning'': Contributed [[RIC]]s have "1E+00" in this field, which is rubbish. The factor is hard coded in the configuration files for the [[FX]] curves THB, THD, and BRL.@@
!! References
* DataScopeRateFeed
This web site and the software in http://launchpad.net/dsratefeed are maintained by MauriceManeschi.
He can be contacted at:
* dsrf at redwaratah dot com
* http://www.redwaratah.com
A Cross Currency basis margin is a representation of the liquidity and credit quality impact on transferring monies between different currencies in the future. It is expressed for each currency against the USD as a number of basis points above the yield swap curve for a series of time bands. It is used by banks when dealing a Cross Currency Swap, and hence impacts the fair value of such a deal.
Note that the Cross Currency basis margin is only applied to the fair value of Cross Currency Swap. It is not used for Foreign Exchange or Currency option deals, nor for Interest Rate Swaps in a foreign currency.
I discovered an unsupported way to implement Cross Currency basis margins in [[Quantum]]. The RiskRateFeed replicates these into [[Risk]]. [[Contact Us]] if you want to know more.
!! References
* RiskRateFeed
Defn: a margin used when valuing Cross Currency Interest Rate Swaps
<<tiddler [[Reuters Datascope Select]]>>
<<tiddler DataScopeRateFeed>>
<<tiddler [[Configuration Files]]>>
[>img[http://farm4.staticflickr.com/3747/12581848004_846db4dce9_n.jpg][http://flic.kr/p/kaPg4S]]
DataScopeRateFeed is a key component of the [[RateFeed]] Product. It:
* captures rates from [[Reuters Datascope Select]] and the [[Manual RateFeed]]
* builds these rates into curves (in the [[curves database]])
* sets the [[End of Day]] curves
The DataScopeRateFeed has been implemented as a [[Python]] application that is largely configured from a series of [[Configuration Files]]. It runs in a Windows session on the production server as a windowed application and uses the Acorn internet connection to access rates from the [[Reuters Datascope Select]] [[FTP]] site. It writes its rates and curves to the [[pool database]] and [[curves database]] respectively. The DataScopeRateFeed is not intended to be used interactively; it is just restarted when the [[Configuration Files]] change.
The [[Python]] files are found in http://launchpad.net/dsratefeed.
!! Usage
To run DataScopeRateFeed, double click the "DataScope RateFeed" icon on the production server under the [[RateFeed user]]. The first thing the application does is check its [[Configuration files|DataScope RateFeed/Configuration]]. If there is a problem with these, it will stop and pop up an error message right away. The error message has a {{btn title{Retry}}} button if the application believes you could resolve the problem by editing the [[Configuration Files]]. Once the window pops up, check that the health bars are green (that you expect to be green) and that the application mode is "Live".
The DataScopeRateFeed can be started even if all its dependencies are not available. (For example, the [[Manual RateFeed]] or the [[FTP]] site.) If the configuration file is good, the application will start and wait for the resource to become available. When it does, you will notice the appropriate health bar become green.
Once the DataScopeRateFeed starts, it should not stop until {{btn title{Stop rateFeed}}} is clicked. You should only want to stop it for reconfiguration or system maintenance.
!!! Application Mode
The DataScopeRateFeed can run in one of three application modes:
: //Live// is the standard mode where rates are read from [[Reuters Datascope Select]] and the [[Manual RateFeed]]
: //Test// where rates are read from csv files (formatted exactly as the [[Reuters Datascope Select]] extract files are formatted) in a designated folder, and from the [[Manual RateFeed]]
: //Clone// where rates are read from another DataScopeRateFeed instance, presumably running in live mode on a production server. That instance is the source of both [[Reuters]] and [[Manual RateFeed]] rates
Generally, there should only be one //Live// instance of DataScopeRateFeed. (During testing we have run two live instances accessing the same [[FTP]] site without issue. However, this should not be assumed to be safe.)
The //Test// mode was designed for testing the curve building; by putting specific rates in the CSV files you can confirm that the curves appearing in the other sheets are correctly calculated. However, it has become a part of the [[disaster recovery process|Run DataScope RateFeed when the network is down]]. Note that when running in //Test// mode, file names ending with ".partial.csv" and ".ric.csv", or non-CSV files, are ignored.
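A minimal sketch of that file filter (illustrative only):
{{{
def is_test_input(filename):
    name = filename.lower()
    # accept only plain CSV extracts; skip partial and RIC maintenance files
    return (name.endswith(".csv")
            and not name.endswith(".partial.csv")
            and not name.endswith(".ric.csv"))
}}}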
The //Clone// mode is for testing new curves and definitions on a [[Test Server]]. You can collect extra rates by defining a one off snap in [[Reuters Datascope Select]] and then tell the production DataScopeRateFeed to {{btn title{Check Datascope}}}. You can then test various calculations on the test server by altering a //Clone// configuration. (For example, if you are unsure which of ''bid price'' or ''universal bid price'' is most suitable, try each for a week on different clones and compare the curves.)
!!! Trouble shooting
To see recent errors on the DataScopeRateFeed, click {{btn title{Show log}}}.
!!! Reconfiguration
Major reconfigurations should be tried out on a //Clone// DataScopeRateFeed running on a [[Test Server]] before applying them in production. To reconfigure the DataScopeRateFeed:
# Edit the appropriate [[Configuration Files]]
# Stop and start the DataScopeRateFeed (or the Export Utility, described below)
#* If there are any fatal errors in the configuration files, you will be prompted immediately to fix them.
# Confirm the configuration is OK (the "Configuration" Health Bar should be green after DataScopeRateFeed has finished the initial build)
# Commit the changed configuration files to [[Bazaar]]
@@''Warning'': Do not stop and restart DataScopeRateFeed around an "End of Day" schedule. (It will prompt if within 15 minutes.) If you miss the schedule because you had a faulty configuration file, you will need to manually enter [[End of Day Rates]] for that day.@@
!!! Export Utility
There is a [[RateFeed Export Utility]] as part of the DataScopeRateFeed. It is used to export curve definitions, build technical documentation and export instrument lists.
!!! Maintenance
All maintenance for the DataScopeRateFeed is automated.
!! Configuration
There are a number of configuration files that define what the DataScopeRateFeed does and how it does it. These include:
* Locations of databases and files
* Application Mode to run under
* Where to source rates ([[DataScope RateFeed]] and [[Manual RateFeed]])
* When to collect rates from DataScope
* How to build curves from [[RIC]]s
** Formulae to use in calculating curves
* When to set [[End of Day]] rates
The configuration is described in detail in [[DataScope RateFeed Configuration|Configuration Files]]
!! Design Notes
Detailed design information is found in [[DataScope RateFeed Technical Description|Technical Notes]]
!! Business Decisions
* See Business Decisions in [[Configuration Files]] for behaviours in the DataScopeRateFeed that have been implemented in the configuration files.
* In the old [[RateFeed]], the ''Reuters RateFeed'' would capture rates and the QuantumRateFeed would flag [[End of Day (Rate type)]]. The setting of [[End of Day]] has been moved to DataScopeRateFeed and is configured in the EOD Timing section of the [[Configuration Files]]. This centralises the configuration to one application and simplifies QuantumRateFeed to just write what is in the [[curves database]]. (The old QuantumRateFeed needed 17 workbooks with custom code in each and special scheduling routines.)
* At present the DataScopeRateFeed runs as a [[Python]]-based script on the production server. This is because the firewall does not allow it to run in the DMZ (as it neither allows FTP access, nor internal connection via port 8333).
** [[I|MauriceManeschi]] still intend to move it over once the firewall is reconfigured.
** There are no implications of running outside the DMZ (except that the design documentation is at odds with the implementation).
!! References
[[RateFeed]]
* [[How to set up DataScope RateFeed|Set up DataScope RateFeed]]
* The QuantumRateFeed reads curves from the [[curves database]], as built by the DataScopeRateFeed, and writes them to [[Quantum]].
* The DataScopeRateFeed was implemented under the auspices of {{TT|6640}}
* [[ArchiveRateFeedDownloads.py]] archives old download files once a month
{{design}}
The DataScopeRateFeed is a complete replacement for the Reuters RateFeed. Its purpose is to collect rates from [[DataScope]] and the [[Manual RateFeed]] and build curves according to a set of rules defined in a configuration file. The curves are output into a database, to allow consumption by the QuantumRateFeed.
!! Application Modes
The DataScopeRateFeed has three modes:
# Production
#* Rates data is read from [[DataScope]] and [[Manual RateFeed]]
#* The pool data is populated with the results
# Clone
#* The pool data is routinely snapped from a (production) instance of the DataScopeRateFeed
#* This allows testing of the [[RateFeed]] on a test server
# Test
#* Rates data is read from CSV files in a designated folder and the [[Manual RateFeed]]
#* The pool data is populated with the results
#* This allows for specific testing of the [[RateFeed]] and a "no FTP" disaster recovery
All modes build curves from the pool data when it is updated. These curves are stored in the database.
!! Databases
The DataScopeRateFeed has four databases:
# A simple log file for recording events
#* This is intended to be a fail-safe in the event that some serious error stops the RateFeed
# A folder for dropping FTP downloads in
#* I don't want this, but Andrew has requested a copy of all downloads
#* Requires FTP files to be date and time stamped.
#* Archiving will be manual at present
# Pool data for storing captures
#* This database will be a data capsule. In clone mode, the DataScopeRateFeed will grab this whole file from production and use the copy as its own
## data table for all [[RIC]] and [[Manual RateFeed]] snaps
##* Keep a month's worth of updates, most recent is flagged
##* This means that a RIC that does not get written in a month will be deleted.
## snapshot table records when each file was captured (or when a manual input update was taken)
##* Will include number of ~RICs snapped. Intended for statistical analysis
## snapshot_events table records missing and not permissioned ~RICs
##* linked to snapshot table
# Curve data for stored curves
## fxrates, fwdpts - just like [[QTProduction]]
## intrates - just like [[QTProduction]]
## volhdr and volrates - just like [[QTProduction]]
## log table will have system logs
##* This will be the properly indexed form of the simple log file above
##* All events are written to it
##* Probably archived off monthly
Initially, the latter two databases will be implemented in SQLite. They could be moved to SQL server later if required.
When the DataScopeRateFeed is inactive (most of the time it will be waiting for the next snap) all these databases will be closed. This should allow the application to survive various combinations of network problems/system reboots without hanging. This does mean that when the application wakes up to do something, if it can't get all its databases back, it will attempt to note the reason in the simple log file and go back to sleep for another minute. This should allow it to survive a variety of hostile situations.
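A minimal sketch of that wake/retry behaviour (the function names and the one-minute poll are illustrative, not taken from the implementation):
{{{
import time, logging

logging.basicConfig(filename="ratefeed.log", level=logging.INFO)

def run(open_databases, do_work, poll_seconds=60):
    while True:
        try:
            databases = open_databases()     # reopen everything on each wake-up
        except Exception as exc:
            logging.error("could not open databases: %s", exc)
            time.sleep(poll_seconds)         # go back to sleep for another minute
            continue
        try:
            do_work(databases)
        finally:
            for db in databases:
                db.close()                   # close everything while idle
        time.sleep(poll_seconds)
}}}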
!! Configuration
The goal of the configuration facility of the DataScopeRateFeed is to have:
* every curve fully defined in the configuration file
** This is not possible for some curves, such as the Coal Price Index
* easy to maintain and understand
* eventually automated
* eliminate redundancies - (for example, the AUD yield curve definition describes the rate capture, curve building and writing to [[Quantum]])
The DataScopeRateFeed will have a module for reading the configuration file. It will read it once - you must restart the application to re-read the file. If there are any problems in the file, the application will not run at all; it will pop up an error window and stop. (This means that no-one should make a change without testing.)
The configuration file will start off as a set of INI files. Having several files keeps it clear what is happening in each file. Eventually, the format will be reviewed to see if there is a better configuration file format. (For example, Microsoft Visual Studio uses XML.)
The configuration module will be able to generate a set of [[DataScope]] instrument lists and reports. Initially, this will be in the form of CSV files that can be uploaded into [[DataScope]]. I envisage that this will be replaced by a fully automated facility.
!! User Interface
[[RMS]] have become accustomed to looking at the A1:A12 cells of Rate Feed Live.xls to determine that the RateFeed is running. The aim of the new user interface is to let [[RMS]] have the same "one glance, all OK" experience. The DataScopeRateFeed will implement this via the intranet-based dashboard. This greatly simplifies the application window and removes the need for a RateFeed PC.
!!! Application Window
The DataScopeRateFeed will be implemented as a Windows application running on a Windows server. (Incidentally, no Microsoft Windows specific functionality will be used.) It will be a very simple window that shows:
* Start time
* Application mode
* Last [[DataScope]] update time
** In test mode, this will be "N/A"
* Last [[Manual RateFeed]] update time
* Last Curve update time
* A close button
* A "go look on DataScope right now" button
!!! Dashboard
The dashboard will run from the Operational Server as a thin client.
* It will check the production pool data and curve databases and copy them over when they update
* It will read and display the following statistics from the databases
** Current spot rate and yield curve for a few currencies
** Scrollable list of the last curves written
** Hours, minutes and seconds to the next snap and next EOD event
** Date time of the last Manual RateFeed update
** Date time of the last Configuration file update
** Health bar/dial - click to see
*** Recent errors
*** Number of curves
*** curve ages (perhaps a histogram with hover)
** It may also have "show me a RIC" and "show me a curve"
The dashboard may be extended to include configuration file editing. This would require it to have a "Restart RateFeed" facility too.
!! Data collection
The configuration file defines a set of times at which data exports are expected to be on [[DataScope]]. It will also define the names of those files. The DataScopeRateFeed will only look at those times (when running in production mode). If it does not find the files, it will try again every minute for 20 minutes, after which it will write an error to the log and give up.
When the DataScopeRateFeed finds a file on the [[FTP]] server, it will check the notes and RIC maintenance files too. If it finds an embargo notice, it will take what is available, but reschedule itself to come back when the embargo is complete. If the embargo time does not correspond to the embargo setting in the [[Configuration Files]], DataScopeRateFeed reports an error, but still waits the designated amount of time.
The three files are downloaded (and written to the drop folder) and their data parsed.
* If a RIC has data, it will create a new record in the data table, even if nothing has changed
* If the RIC has error codes in its data fields, the record will be written to the data table but flagged as invalid. The previous good values for the RIC will remain current.
** Thus if some of the curve data is invalid, the curve can still be constructed
** There will be a "fail if any data is invalid" option in the configuration file
When not in clone mode, the DataScopeRateFeed will check the [[Manual RateFeed]] every minute for updates. If it finds one, it will read all the data from the [[Manual RateFeed]] and write a new row in pool data for every rate that has changed or has not updated in a week. This will cause curves using this data to get rebuilt.
!! Curve Building
When there is an update in the data table of pool data (current records only), every impacted curve will be rebuilt. (This is defined in the configuration file. Note that as some curves are derived from other curves, this can cause a cascade of curve building.)
Curves are built according to rules defined in the configuration.
The QuantumRateFeed uses the curves database as a data source.
EOD rates are flagged in the curves database. The QuantumRateFeed just pushes through those rates when updated. (The no overwrite rule remains.) Month end rate validation will not be reflected in the curve database.
!! Alerts
The following events will cause an alert:
* Critical - ''The DataScopeRateFeed stops and waits for human intervention''
** The configuration file is unavailable or corrupt
* Error - ''Human intervention is needed''
** The pool data and/or curve data database is not available after five minutes of trying
** Access to the FTP server is not available after 5 minutes of trying
** A partial snap on the FTP server is not present after 10 minutes
** A full snap on the FTP server is not present after 10 minutes + the embargo time
** The [[Manual RateFeed]] is not available after an hour of trying
** Any RIC reported as not permitted in the DataScope notes file
** A RIC or field is missing when attempting to build a curve
** A calculation fails while building a curve
* Warning - ''Might get better, or might need intervention''
** A RIC is not the same as an Instrument for some [[DataScope]] data
** Any entry in a RIC Maintenance file
** The embargo time reported in the DataScope Notes file is different to the one in the configuration file
** The number of instruments in the total section of the DataScope Notes file is different to the number of ~RICs specified in the configuration file
** A database or FTP site is not available in the second instance until it becomes an error event
* Informational - ''Of passing interest''
** The DataScopeRateFeed starts or stops
** A file is downloaded from [[DataScope]]
** A curve is built
** A database or FTP site is not available in the first instance
!! References
* DataScopeRateFeed
* QuantumRateFeed
The source code of the DataScopeRateFeed and related feeds is located on Launchpad under http://launchpad.net/dsratefeed.
To download it:
# Install Bazaar (http://bazaar.canonical.com/en/)
# Run {{{bzr branch lp:dsratefeed}}}
End of Day has two different meanings in the context of DataScopeRateFeed.
# For most curves in the [[curves database]], once a day the most current curve is flagged as "End of Day"
#* QuantumRateFeed writes these curves to [[Quantum]] as "End of Day"
#* OakvaleCapital had a policy that once a rate was written as "End of Day" it could not change without a formal correction procedure
#** This means that curves with an ~EffectiveDate taken from a RIC may not have "End of Day" curves for every calendar day
# The period of the day when these curves are so flagged
#* The times are set in the [[Configuration Files]]
The purpose of End of Day rates is to have a fixed set of rates for accounting. You can rerun financial reports for any date in the past and still get the same result.
!! References
* [[Rates]]
Defn: a flag for curves to indicate they are the final values for a date, or the period when that flagging happens
<<tiddler [[End of Day]]>>
<<tiddler [[End of Day]]>>
<<tiddler [[End of Day]]>>
[>img[http://farm8.staticflickr.com/7329/12626443055_d3baccba78.jpg][http://www.flickr.com/photos/64724523@N03/12626443055/]]
There is an //Export Utility// as part of the DataScopeRateFeed. It has its own icon on the desktop, but can also be accessed via the {{btn title{Export}}} button on the DataScopeRateFeed user interface.
It is implemented in [[Python]].
!! Usage
When you launch it, it reads the [[Configuration Files]] and reports any errors with them. It lets you perform three exports:
# Curve Definitions
#* Creates Export.xlsx (described below)
# Technical Documentation
#* Creates the documentation files linked to [[Technical Notes]]
# Export Instrument Lists
#* Creates an instrument list for each snap as a CSV file that can either be uploaded directly to [[Reuters Datascope Select]], or compared to an extraction.
!! Configuration
The RateFeed ExportUtility is run via a [[scheduled task|RateFeedExport.bat]]. It still pops up a window, so it needs to be scheduled from a server with a logged-on session (presumably [[RateFeed user]]).
{{{
usage: ExportUtility.pyw [-h] [-c CONFIG] [-x XL] [-o OUTPUT] [-l LOG]
[action [action ...]]
Export utility for the RateFeed
positional arguments:
action Actions to automatically perform
optional arguments:
-h, --help show this help message and exit
-c CONFIG, --config CONFIG
Location of config file [default: K:\Treasury\Rate
Feed\Configuration]
-x XL, --xlfile XL Excel file file name (for curve exports) [default:
export.xlsx]
-o OUTPUT, --output OUTPUT
Path for exports [default: My Documents]
-l LOG, --log LOG File to write logging to [default: My Documents\
export.log]
}}}
where action is any abbreviation of:
* Curve Definition (see [[RateFeedExport.bat]])
* Technical Document
* Instrument List
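A minimal sketch of how such abbreviation matching could work (illustrative; the utility itself may resolve the actions differently):
{{{
ACTIONS = ["Curve Definition", "Technical Document", "Instrument List"]

def resolve_action(abbrev):
    matches = [a for a in ACTIONS if a.lower().startswith(abbrev.lower())]
    if len(matches) != 1:
        raise ValueError("ambiguous or unknown action: %s" % abbrev)
    return matches[0]
}}}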
!! Export.xlsx
[>img[http://farm3.staticflickr.com/2881/12626981944_1fd34d7cf0.jpg][http://www.flickr.com/photos/64724523@N03/12626981944/]]
The //RateFeed Export// is used to show how curves are built from the:
* [[RIC]]s and rates from the [[pool database]]
* [[RateFeed Formula]]e defined in the [[Configuration Files]]
It is laid out as follows:
# There is a tab for each group of curves based on the engine that built it.
# The first column (column B) shows the timeband definition. Generally, this is the same as the timeband in Quantum, but for some obtuse curves, the timeband is calculated (generally in the far right hand yellow column).
# @@bgcolor: #cff;The blue columns show the [[pool database]].@@ The pool is composed of [[Reuters]] data and [[Manual RateFeed]] data (without any calculations or formatting). The column names refer to the [[DataScope]] column names. [[Manual RateFeed]] data is prefixed with "MRF!". Some rows have no pool data; look at the calculation columns to understand why.
# @@bgcolor: #cfc;The green columns show the calculations@@ that were applied to build the curve. These are written in the [[Python]] language and contain the following [[functions|RateFeed Formula]]:
#* pool(field name, ric) = retrieve the field for the given ric from the [[pool database]]. If the ric is not provided, use the row ric (in column C)
#* curve(name,timeband,field) = retrieve the field for the given curve and timeband from the [[curves database]]. If the name, timeband or field are blank, use the current curve, timeband and field respectively
#* bond(name) = retrieve the yield from the given bond from the [[curves database]]
#* interpolate() = calculate the linear interpolation from the rates before and after based on the number of days for each (a sketch follows this list)
#* syntheticYield() = calculate the yield from the forward points and the USD yield using ACTUAL/360.
#* spotDays() = calculate the number of days after spot (avoiding weekends using modified following)
#* chain(list of fields) = return the first non zero value from the fields (that is, pool("field name")), checked in order
# @@bgcolor: #ffc;The yellow columns show the [[curves database]].@@ This is the result of calculating the green columns (there will be a yellow value for every green formula, and vice versa). It is pretty much what will be inserted into Quantum (barring some rounding).
#* [[FX]] spot dates and days are always recalculated by [[Quantum]] as it understands holidays.
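A minimal sketch of the interpolation referred to above (illustrative; the built-in described in Formulae.ini interpolates on a 30/360 basis and works on curve timebands rather than a plain list):
{{{
def interpolate(points, days):
    """points: sorted list of (days, rate) with actual values; flat beyond the ends."""
    if days <= points[0][0]:
        return points[0][1]
    if days >= points[-1][0]:
        return points[-1][1]
    for (d0, r0), (d1, r1) in zip(points, points[1:]):
        if d0 <= days <= d1:
            # linear interpolation between the surrounding points, weighted by days
            return r0 + (r1 - r0) * (days - d0) / float(d1 - d0)
}}}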
!! References
* DataScopeRateFeed
* [[RateFeedExport.bat]]
FRM were a division of OakvaleCapital that:
* Specified the [[Rates]] that were required and how they were used to build curves
* Defined which [[RICs|RIC]] were required
* Validated the rate curves, typically for the [[Month End Rates]]
* Made decisions on remedial action in the event of incorrect or missing rates
Essentially, they were the managers of the RateFeed.
!! References
* OakvaleCapital
Defn: the managers of the RateFeed
FTP stands for File Transfer Protocol. DataScopeRateFeed uses FTP to access [[Reuters Datascope Select]].
The [[Python]] ftplib is sufficient for this purpose.
!! References
* http://en.wikipedia.org/wiki/File_Transfer_Protocol
Defn: the protocol DataScopeRateFeed uses to access [[Reuters Datascope Select]]
FX is a common abbreviation for "foreign exchange".
!! References
Defn: a common abbreviation for "foreign exchange".
Here is the Formula.ini file used at OakvaleCapital (as a guide to how to write and use [[RateFeed Formula]]e)
{{{
; ============================================================
; Formulae
;
; Every formula used is defined here
; ============================================================
[pool]
formulaType=builtin
description=Return the value of the given field for the given ric
argument2=ric
argument1=field
[curve]
formulaType=builtin
description1=Return the rate for the given field of the given timeband of the given curve. If you provide a value, that value will
description2=be written to the curve data. (This is used in evaluateLater() functions to write the result of the calculation.)
argument1=curve=None
argument2=timeband=None
argument3=field=None
argument4=value=None
[ditto]
formulaType=builtin
description=Perform the same calculation as per this field in the last timeband
argument1=timeband=None
[fwdPrice]
formulaType=builtin
description=Return the current forward price (spot plus the forward points) for the given currency and timeband
argument1=ccy
argument2=tb
[fwdYield]
formulaType=builtin
description=Return a tuple of the current forward yield and basis for the given currency and timeband
argument1=ccy
argument2=tb
[timebands]
formulaType=builtin
description=Return a list of the timebands (in short codes, like 'sp', '1m', '4y') for the given curve
argument1=curveName
[findCurve]
formulaType=builtin
description=Return the name of the curve that matches the given type (one of "fx" or "yield") and currency
argument1=ty
argument2=ccy
[bond]
formulaType=builtin
description=Return the current "Bid" of the given security from the curve data
argument1=sectype
[curveDetails]
formulaType=builtin
description=Return a dictionary of curve details for the given name. At present, this just returns the currency
argument1=curveName
[setCurveDependency]
formulaType=builtin
description1=Advise DataScope RateFeed that every time the dependsOn curve updates, the dependant curve must be recalculated
description2=This is used for synthetic yield curves at present to flag that a synthetic curve needs to be rebuilt when either
description3=the USD Yield curve or the FX curve for the currency updates.
argument1=dependant
argument2=dependsOn
[interpolate]
formulaType=builtin
description=Flag this point to be calculated via interpolation after the rest of the values have been filled
; This formula takes no arguments. It flags the current field of the current timeband as needing interpolation
; Once the rest of the fields are filled, the formula finds the timeband before and the point after with an actual
; value. It then linearly interpolates on a 30/360 days basis. If there is no value before (or after), it takes the
; actual value of the nearest timeband (that is, it extrapolates flat off either end of the curve)
module=interpolate
[spotDays]
formulaType=explicit
description=Calculate the days field for FX curves. This is either the days between the effective date and spot, or the days between
description2=the spot date and the timeband date (ACTUAL/ACTUAL)
line1 =|effectiveDate=curve("", "", "rate_dt")
line2 =|spotDays = 4 if effectiveDate.strftime('%a') in ('Thu','Fri') else 2
line2.1=|if Formula.currentElement == 'sp': return spotDays
line2.2=|if Formula.currentElement == 'sw': return 7
line2.3=|if Formula.currentElement[-1] == 'm':
line2.4=| months = int(Formula.currentElement[:-1])
line2.5=|elif Formula.currentElement[-1] == 'y':
line2.6=| months = int(Formula.currentElement[:-1]) * 12
line2.7=|else:
line2.8=| raise Formula.CalculationError('Unsupported timeband for spotDays(): %s' % Formula.currentElement)
line5 =|spotDate = effectiveDate + timedelta(days=spotDays)
line6 =|timebandTuple = [spotDate.year, spotDate.month, spotDate.day]
line7 =|years,timebandTuple[1] = divmod(spotDate.month + months - 1, 12)
line8 =|timebandTuple[0] += years
line9 =|timebandTuple[1] += 1 # subtracted 1 two lines up
line10 =|while True:
line11 =| try:
line12 =| return (datetime(*timebandTuple) - effectiveDate).days - spotDays
line13 =| except ValueError:
line14 =| timebandTuple[2] -= 1 # must be February, but is it a leap year? We need to test each date to know
[chain]
formulaType=explicit
description=Returns the non zero pool value of the first working field in the list
argument1=fields
line1 =|fields = fields.split(',')
line2 =|for field in fields:
line3 =| try:
line4 =| value = pool(field)
line5 =| if value: return value
line6 =| except:
line7 =| continue
line8 =|return 0.00
[effectiveDatePlus]
formulaType=explicit
description0=Calculate the actual date for timebands of a curve. This is the same day of the month the given number of
description1=years or months forwards
argument1=months=0
argument2=years=0
line1 =|effectiveDate=curve("", "", "rate_dt")
line2 =|months += effectiveDate.month
line3 =|years += effectiveDate.year + (months - 1) // 12
line4 =|months = (months - 1) % 12 + 1
line5 =|days = effectiveDate.day
line6 =|while True:
line7 =| try:
line8 =| return datetime(years, months, days)
line9 =| except ValueError:
line10=| days -= 1 # must be February, but is it a leap year? We need to test each date to know
[syntheticYield]
formulaType=builtin
description1=Creates a yield curve from the USD Yield curve and the given FX curve. Where there is no timeband in the
description2=FX curve, the formula uses a linear interpolation between the synthetic point before and after
module=syntheticYield
[gtZero]
formulaType=explicit
description=Used when calculating days for commodity volatilities, returns 1 if less than or equal to zero
argument1=days
line1 =|return days if days > 0 else 1
}}}
!! Terms used in this wiki
{{glossary title{
<<list filter "[tag[Glossary]] [sort[title]]" template: [[GlossaryTemplate]]>>
}}}
<<view title link>>: <<view ::Defn text>>
!Welcome
The DataScopeRateFeed is a Windows based [[Python]] application that captures rates from [[Reuters Datascope Select]] and builds them into curves suitable for entry into [[Quantum]] and [[Risk]]. Supported curves include [[FX]], commodities, interest rates (including ratesetting rates), option volatilities and securities.
I created this product to capture and manage rate curves for OakvaleCapital. It was highly successful. Unfortunately, the company was not, and when it was closed, the administrators granted me the use of the scripts I had written. This application will run as is in a SunGard AvantGard [[Quantum]] implementation, but you should budget a week to get the configuration files right. However, I expect this might be a very useful starting point for anyone using the Thomson Reuters' DataScope Select product.
The application is written in [[Python]] v3 and is designed to run under Windows. (I have run the capture program, DataScopeRateFeed.pyw, under Linux, so Windows is only required if you wish to feed the rates into a Treasury system via a DCOM library, such as the one packaged with AvantGard [[Quantum]] and AvantGard [[Risk]].) It uses Tkinter for the UI, SQLite for internal data management, [[FTP]] to download rates, Excel for manual rates (used for rates not available in DataScope Select), INI files for configuration, Python for inline scripting and jQuery/backbone.js/underscore.js/RPC for an intranet based dashboard.
The application comprises
# DataScopeRateFeed - stand alone application that captures rates into a pool database and builds them into curves in the curves database.
# QuantumRateFeed - stand alone application that writes live and end of day curves into AvantGard Quantum using the published DCOM library
# RiskRateFeed - stand alone application that manages the general replication of rates from AvantGard Quantum to AvantGard Risk and handles the special cases.
# ExportUtility - stand alone application that exports rate curves showing how the curves were built. It also manages the inline documentation. It has a batch mode that allows daily rate snaps to be archived for reference purposes.
Let me know if you find this application useful. If I am free, I am happy to quote on enhancements. However, anyone with decent Python skills should have no problem modifying the code.
! Latest Updates
<<timeline modified 5>>
! Contents
This wiki is a slice of the [[RMS]] internal wiki. I have selected all articles pertaining to the RateFeed and tried to remove most of the OakvaleCapital specific information (as it is a distraction). If you are curious about some of the sideways references in the wiki, do not hesitate to [[Contact Us]].
The MainMenu (on the right) shows the key articles in this wiki. The tags (on the left of each article) should assist in jumping to related articles. I also have a convention of having a "References" section at the end of every article to show related information.
! History
<<tiddler [[RateFeed##History]]>>
! Author
The DataScope RateFeed was written by MauriceManeschi when he was an OakvaleCapital employee.
These articles describe how to perform an action
The term "income securities" is a little misleading but represents the rate name for price-based quoting investment instruments. These differ from Bonds which are Yield based instruments.
These have their own Issuer/Instrument Rates under Interest Rates
!! References
Defn: the rate name for investment instruments quoted on price
<<tiddler [[Income Securities]]>>
@@background-color: #eef;''Notice:'' This application was designed to run on one site. It is likely that any installation on another site will require some software modification. Please contact MauriceManeschi to discuss.@@
To install the DataScopeRateFeed
# Install [[Python]] v3+ on a suitable server.
#* Ensure the server has FTP access (through your firewall) to the Reuters Site
# Copy the files into a folder on that server
#* You probably should create a [[Bazaar]] branch so you can manage changes
# Create a [[configuration file|Configuration Files]]
#* Ensure you have your Reuters DataScope user name and password
# Run the DataScopeRateFeed
#* It will complain if there is anything wrong in the configuration files and give you a chance to fix it.
#* It will create an empty [[curves database]] and [[pool database]]
# Run the other utilities as required
#* I suggest the ExportUtility be scheduled to run at key times
There is more detail in the [[Configuration]] articles:
<<list filter "[tag[Configuration]] [sort[title]]">>
Live Rates are the opposite of [[End of Day]] rates; they update throughout the day and should only be used for valuation when the as at date is today. All curves built by DataScopeRateFeed are written to Quantum as [[Live Rates]] unless so flagged in the [[Configuration Files]] curve definitions.
!! References
Defn: rates that change throughout the day
/***
|''Name:''|LoadRemoteFileThroughProxy (previous LoadRemoteFileHijack)|
|''Description:''|When the TiddlyWiki file is located on the web (view over http) the content of [[SiteProxy]] tiddler is added in front of the file url. If [[SiteProxy]] does not exist "/proxy/" is added. |
|''Version:''|1.1.0|
|''Date:''|mar 17, 2007|
|''Source:''|http://tiddlywiki.bidix.info/#LoadRemoteFileHijack|
|''Author:''|BidiX (BidiX (at) bidix (dot) info)|
|''License:''|[[BSD open source license|http://tiddlywiki.bidix.info/#%5B%5BBSD%20open%20source%20license%5D%5D ]]|
|''~CoreVersion:''|2.2.0|
***/
//{{{
version.extensions.LoadRemoteFileThroughProxy = {
major: 1, minor: 1, revision: 0,
date: new Date("mar 17, 2007"),
source: "http://tiddlywiki.bidix.info/#LoadRemoteFileThroughProxy"};
if (!window.bidix) window.bidix = {}; // bidix namespace
if (!bidix.core) bidix.core = {};
bidix.core.loadRemoteFile = loadRemoteFile;
loadRemoteFile = function(url,callback,params)
{
if ((document.location.toString().substr(0,4) == "http") && (url.substr(0,4) == "http")){
url = store.getTiddlerText("SiteProxy", "/proxy/") + url;
}
return bidix.core.loadRemoteFile(url,callback,params);
}
//}}}
[[Home]]
[[Download]]
[[Installation]]
[[Contact Us]]
DataScopeRateFeed QuantumRateFeed RiskRateFeed ExportUtility RateFeedDashboard [[Glossary]]
<<tiddler [[Manual RateFeed]]>>
[>img[http://farm8.staticflickr.com/7446/12603483055_0433384b6b.jpg][http://www.flickr.com/photos/64724523@N03/12603483055/]]
Manual RateFeed is an Excel workbook that contains Rates not available from [[Reuters]].
It is identified in the [[Configuration Files]], along with the name of the worksheet that contains the rates. That sheet has field names in the top row, the first assumed to be [[RIC]]. (Make the field names the same as those used by Reuters, for your sanity's sake.)
The [[Manual RateFeed]] used by OakvaleCapital had many sheets for different rate categories. Most cells in those sheets were protected so that the user, typically [[FRM]], could not make data errors. A hidden sheet referenced these inputted rates and laid them out to suit the DataScopeRateFeed.
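If you are building your own manual workbook, a reader along these lines is enough to turn such a sheet into a dictionary keyed by [[RIC]]. This is an illustration using openpyxl, not the DataScopeRateFeed's actual Excel reader:
{{{
# Illustrative only - the DataScopeRateFeed's own Excel reader may work differently
from openpyxl import load_workbook

def read_manual_rates(path, sheet_name):
    """Return {ric: {field: value}} from the manual worksheet (field names in row 1, RIC in column A)."""
    ws = load_workbook(path, data_only=True)[sheet_name]
    rows = ws.iter_rows(values_only=True)
    header = next(rows)              # the first row holds the field names; the first column is the RIC
    rates = {}
    for row in rows:
        if row[0] is None:
            continue
        # In the pool these keys appear prefixed with "MRF!" (see the RIC article)
        rates[str(row[0])] = dict(zip(header[1:], row[1:]))
    return rates
}}}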
!! References
* [[Configuration Files]]
Defn: an Excel workbook containing [[Rates]] not available from [[Reuters]]
Maurice Maneschi is the maintainer of the DataScope RateFeed. He can be contacted via dsrf at redwaratah dot com.
! Maintenance tools
[[Control panel|http://dsratefeed.tiddlyspot.com/controlpanel]]
<<tiddler [[month end]]>>
Nagios is a powerful monitoring system that enables organizations to identify and resolve IT infrastructure problems before they affect critical business processes. //ref their web site//
OakvaleCapital used Nagios to both monitor server and network uptime and also to detect undesirable data conditions on those servers.
!!Reference
* http://www.nagios.org
Defn: A server status monitoring tool
OakvaleCapital had a dedicated PC that flashed through Nagios screens. It was key to fast response to errors, such as [[Reuters Datascope Select]] delayed snaps.
!! References
Defn: A screen showing current [[Nagios]] statuses
Oakvale Capital was the employer of MauriceManeschi when he wrote the DataScope RateFeed. The company stopped trading in December 2013. The administrators permitted him to take and use the scripts he had written during his time in Oakvale.
Within the context of managing the RateFeed, there were three departments that were intimately involved:
* [[RMS]] developed and maintained the RateFeed. They monitored it operationally
* [[FRM]] were the managers of the RateFeed
* [[Operations]] supported clients whose reports depended on the timely construction of rate curves
!! References
Defn: the original user of the RateFeed
Operations was a division of OakvaleCapital. They were major consumers of rates and curves, albeit indirectly.
!!References
Defn: a rates consumer
/***
|''Name:''|PasswordOptionPlugin|
|''Description:''|Extends TiddlyWiki options with non encrypted password option.|
|''Version:''|1.0.2|
|''Date:''|Apr 19, 2007|
|''Source:''|http://tiddlywiki.bidix.info/#PasswordOptionPlugin|
|''Author:''|BidiX (BidiX (at) bidix (dot) info)|
|''License:''|[[BSD open source license|http://tiddlywiki.bidix.info/#%5B%5BBSD%20open%20source%20license%5D%5D ]]|
|''~CoreVersion:''|2.2.0 (Beta 5)|
***/
//{{{
version.extensions.PasswordOptionPlugin = {
major: 1, minor: 0, revision: 2,
date: new Date("Apr 19, 2007"),
source: 'http://tiddlywiki.bidix.info/#PasswordOptionPlugin',
author: 'BidiX (BidiX (at) bidix (dot) info',
license: '[[BSD open source license|http://tiddlywiki.bidix.info/#%5B%5BBSD%20open%20source%20license%5D%5D]]',
coreVersion: '2.2.0 (Beta 5)'
};
config.macros.option.passwordCheckboxLabel = "Save this password on this computer";
config.macros.option.passwordInputType = "password"; // password | text
setStylesheet(".pasOptionInput {width: 11em;}\n","passwordInputTypeStyle");
merge(config.macros.option.types, {
'pas': {
elementType: "input",
valueField: "value",
eventName: "onkeyup",
className: "pasOptionInput",
typeValue: config.macros.option.passwordInputType,
create: function(place,type,opt,className,desc) {
// password field
config.macros.option.genericCreate(place,'pas',opt,className,desc);
// checkbox linked with this password "save this password on this computer"
config.macros.option.genericCreate(place,'chk','chk'+opt,className,desc);
// text savePasswordCheckboxLabel
place.appendChild(document.createTextNode(config.macros.option.passwordCheckboxLabel));
},
onChange: config.macros.option.genericOnChange
}
});
merge(config.optionHandlers['chk'], {
get: function(name) {
// is there an option linked with this chk ?
var opt = name.substr(3);
if (config.options[opt])
saveOptionCookie(opt);
return config.options[name] ? "true" : "false";
}
});
merge(config.optionHandlers, {
'pas': {
get: function(name) {
if (config.options["chk"+name]) {
return encodeCookie(config.options[name].toString());
} else {
return "";
}
},
set: function(name,value) {config.options[name] = decodeCookie(value);}
}
});
// need to reload options to load passwordOptions
loadOptionsCookie();
/*
if (!config.options['pasPassword'])
config.options['pasPassword'] = '';
merge(config.optionsDesc,{
pasPassword: "Test password"
});
*/
//}}}
Python is the scripting language that DataScopeRateFeed, QuantumRateFeed, RiskRateFeed and ExportUtility are written in.
All these are written in version 3 of Python.
!! References
* http://www.python.org
Defn: the scripting language that DataScopeRateFeed, QuantumRateFeed, RiskRateFeed and ExportUtility are written in.
PythonWin is a Windows based [[Python]] IDE developed by Mark Hammond. It is very good for prototyping concepts in Python.
!!References
* http://sourceforge.net/projects/pywin32/
Defn: a Windows based [[Python]] IDE
[[QTOM]] is a DCOM library for accessing [[Quantum]] functionality directly.
SunGard provide a sample for how to write [[Rates]] to [[Quantum]] in an Excel workbook. The QuantumRateFeed is based on that code.
!! References
* SunGard
* [[Quantum]]
Defn: a DCOM library that allows rate curves to be written to [[Quantum]].
QTProduction is the [[Quantum]] database.
!!References
Defn: the [[Quantum]] database.
AvantGard Quantum is a treasury system available from SunGard
!! References
* SunGard
Defn: a treasury system
[>img[http://farm4.staticflickr.com/3774/12601081934_2810a0ee62.jpg][http://www.flickr.com/photos/64724523@N03/12601081934/]]
[[QuantumRateFeed]] is now a Python application - implemented as part of the new DataScopeRateFeed. QuantumRateFeed inserts [[live rates|Live Rates]] (that get updated at specific times) and [[end of day rates|End of Day Rates]] (only inserted once).
It is written in [[Python]] and is located in http://launchpad.net/dsratefeed.
!! Business Decisions
The following decisions were made during the implementation of the QuantumRateFeed:
# Quantum has code that calculates spot days and dates from static data. The QuantumRateFeed inserts these fields as empty strings (thus allowing [[Quantum]] to calculate them) even though DataScopeRateFeed has already calculated them.
#* This logic is not carried through to the timebands as some commodities have specific dates they wish the timebands to fall on.
#* The old QuantumRateFeed used to do this for [[End of Day]] only; for [[Live Rates]] it would calculate spot days and dates. MauriceManeschi deemed this behaviour unwarranted for the new QuantumRateFeed
# To insert a security, you need its ISIN number. QuantumRateFeed gets this ISIN by pulling up an MMDEALS form and filling in the instrument. The refresh gives the ISIN.
!! References
* [[How to set up Quantum RateFeed|Set up Quantum RateFeed]]
* [[How to set up Risk RateFeed|Set up Risk RateFeed]]
* [[RateFeed]]
** DataScopeRateFeed
** RiskRateFeed
A [[RIC]] is a //R//euters //I//nstrument //C//ode. It is used to define a rate in Reuters. The [[Configuration Files]] define how [[RIC]]s are captured from [[Reuters Datascope Select]] and built into curves.
@@background-color:#AAF;''Note'': The keys for rates in the [[Manual RateFeed]] are also called [[RIC]]s to simplify documentation. These "~RICs" are prefixed by "MRF!"@@
!! Usage
!!! DataScope Select
[[Reuters Datascope Select]] allows a number of instrument lists to be defined. Although not required, [[RMS]] keys each instrument off its [[RIC]].
!!! DataScope RateFeed
The DataScopeRateFeed writes all data captured from [[Reuters Datascope Select]] to the [[pool database]]. It is keyed by its [[RIC]] in the ''currentRates'' table. [[Configuration Files]] reference rates from the [[pool database]] via their [[RIC]].
!!! Common Fields
|! Code |! Usage |
|//Trade Date// |commonly used to determine the effective date of many rate curves |
|//Universal Bid Price// |the best bid price |
|//Universal Ask Price// |the best ask or offer price |
|//Primary Activity// |is used to reference bid rates (e.g. BBSY) |
|//Secondary Activity// |is used to reference offer rates (see volatilities) |
|//Cross Rate Factor// |is the scaling factor for forward points |
!! Reference
* [[RateFeed Formula]]e
!!! Reading ~RICs
<<<
I'm afraid we don't cover the basis info in any other template. However the RIC gives you the information when you know how to read it.
Please find attached information about the Swap ~RICs for the major currencies.
A few examples:
: ~USDSB3L3Y=RR -> SB=~Semi-Annual Bond 3L= 3 Month Libor
: ~USDAM3L3Y= -> AM=Annual Money 3L=3 Month libor
: ~EURAB6E3Y= -> AB=Annual Bond 6E=6 Month Euribor
<<<
RMS was a division of OakvaleCapital responsible for the operation and maintenance of the RateFeed. MauriceManeschi was a member of RMS
!! References
Defn: a division of OakvaleCapital responsible for the operation and maintenance of the RateFeed
~RateCheckerEOD.py is a script used in OakvaleCapital to confirm that [[End of Day]] rates were successfully posted.
I have not included it in this package as it is very specialised towards the OakvaleCapital handling of rates, and will not translate well. However, if you are interested in trying it, please [[Contact Us]].
!! References
Defn: script used in OakvaleCapital to confirm that [[End of Day]] rates were successfully posted.
[>img[http://farm8.staticflickr.com/7329/12598396075_32719f5c4b.jpg][http://flic.kr/p/kch5er]]
[[RateFeed]] is a collection of processes that results in rates and margins being updated into [[Quantum]] and [[Risk]].
[[Rates]] are mostly captured from [[Reuters Datascope Select]]. [[FRM]] sources the remainder of the rates and margins from a range of sources, principally [[Bloomberg]], and enters them mostly into the [[Manual RateFeed]].
The DataScopeRateFeed uses the rates in the [[Manual RateFeed]] and [[Reuters Datascope Select]] to put the [[Rates]] into the [[pool database]]. It then uses these rates to build curves which it writes to the [[curves database]].
The QuantumRateFeed reads the [[curves database]] and writes the curves into [[Quantum]] via the [[QTOM]].
RiskRateFeed reads rates from [[Quantum]], backdated to the last [[month end]] to pick up edits and new [[Income Securities]], and writes them into [[Risk]].
!! Configuration
The DataScopeRateFeed is configured via a number of configuration files described in the [[Configuration Files]]. The QuantumRateFeed and RiskRateFeed are configured via the {{btn title{Configuration}}} button on the user interface.
The instruments to be captured are defined in [[Reuters Datascope Select]] Instruments lists.
!! Validation
[[FRM]] perform rates validation at [[month end]], or as required.
[[RMS]] use [[Nagios]] to confirm that the various processes that make up the [[RateFeed]] are running. Further, [[RMS]] uses the [[RateCheckerEOD.py]] to confirm that [[Rates]] have been updated in [[Quantum]] and are not "blatantly" wrong.
!! Business Decisions
<<tiddler [[Configuration Files##Business Decisions]]>>
<<tiddler [[QuantumRateFeed##Business Decisions]]>>
!! Verification
* [[RateCheckerEOD.py]] sends daily email
* [[RateFeedHealth.py]] displays on the [[Nagios Viewer]]
!!! Nagios errors
OakvaleCapital used a [[Nagios]] system to monitor various error conditions in the DataScopeRateFeed and QuantumRateFeed, and to ensure they were running. The following error messages indicate what was monitored:
; CRITICAL: Failed to connect to database server
: The QuantumRateFeed could not be checked as [[Nagios]] could not connect to the database server. There will be lots of other errors explaining this
; CRITICAL: No live updates found in Quantum
: No live updates found yesterday or today in [[Quantum]]. Probably the QuantumRateFeed is not running. Try restarting it.
; CRITICAL: There have been no live updates on Quantum since yesterday
: No live updates found since yesterday in [[Quantum]]. This suggests either the DataScopeRateFeed failed to do its 3:00am build, or QuantumRateFeed is down.
; CRITICAL: DataScope RateFeed does not appear to be running
: The DataScopeRateFeed is not responding to health checks
; CRITICAL: __name.ini__ was edited on __ddd dd-mmm__ and has not been committed
: This configuration file needs to be committed to [[Bazaar]]
; CRITICAL: hosted.datascope.reuters.com has an IP address of __nnn.nnn.nnn.nnn__. Expected 192.165.219.255
: The domain name for [[Reuters Datascope Select]] has changed. Please update the [[Acorn Firewall]] with the new address
; WARNING: Quantum RateFeed live updates are late
: There are live rates for today, but they should have updated by now. Please check
; WARNING: DataScope RateFeed has a software error
: There has been a software error in DataScopeRateFeed. Raise a work item. (This error does not clear until you restart the DataScopeRateFeed; so do this once the analysis of the defect is complete.)
; WARNING: DataScope RateFeed has a configuration error
: There has been a configuration error in DataScopeRateFeed. Investigate the log file for the reason. (This error does not clear until you restart the DataScopeRateFeed or 3:00am the next day; generally you would restart the DataScopeRateFeed once the analysis of the issue is complete.)
; WARNING: DataScope RateFeed Snap ''name'': ''some message''
: The DataScopeRateFeed is expecting a snap that is not on [[Reuters Datascope Select]]. This could be because:
:* [[Reuters]] has an error - check the extracted files for a "processing 0%" message (see {{TT|6771}})
:* The times for the snap do not match between [[Reuters Datascope Select]] and the configuration files
:* Lots of [[RIC]]s have expired - check the DataScopeRateFeed log file
; WARNING: __name.ini__ needs to be committed
: This configuration file has been edited in the last 24 hours, but still needs to be committed to [[Bazaar]]
!! References
* DataScopeRateFeed
** [[Configuration Files]]
* QuantumRateFeed
* RiskRateFeed
* The description of the End of Day process is provided here: [[End of Day (Ratefeed schedule)]].
* [[Rates]]
!!! History
* Version 3.9 (Anaconda) in October 2013 handled a couple of minor format changes in the files coming from [[Reuters Datascope Select]].
* Version 3.8 (Krait) in March 2013 changed the collection of files to be processed on [[Reuters Datascope Select]] to be more robust.
* Fixed the interpolation module in February 2013. I changed the version of DataScopeRateFeed to 3.7a to signify that a change happened, but not to the actual [[RateFeed]]
* Version 3.7 (Boa) fixes an obscure defect with the order in which curves are built and when to give up. (Clients and [[RMS]] will not notice any difference in behaviour.) It was released in January 2013
* Version 3.6 (Asp) is a minor release to accept a changed "notes" files from v7.1 of the [[Reuters Datascope Select]] in November 2012
* Version 3.5 (Basilisk) introduced [["MinimumValue"|DataScope RateFeed/Configuration#Curve_definition]] and allows the [[RateFeed Export Utility]] to run in batch mode in August 2012
* Version 3.4 (Cobra) in June 2012 contained a small fix to DataScopeRateFeed and RiskRateFeed to handle runtime errors
* Version 3.3 (Adder) in April 2012 changed to treat the USD swap curve as Quarterly when calculating synthetic fields
* Version 3.2 (Taipan) in April 2012 to represent the addition of the RiskRateFeed to the [[RateFeed]] application suite.
* Version 3.1 Viper in February 2012
** The version jump indicates that version 1 ran on Quantum v3 (in Excel). For Quantum v4, the Excel files were significantly rewritten, becoming version 2. Version 3 is the Python version under Python 2.5. Version 3.1 is the Python version under Python 3.2. It also gets rid of [[RateFeedCopy.xls]]
* Version 1.4 Galah in July 2011
* Version 1.3 Red-capped Robin in July 2011
* Version 1.2 Flamingo in July 2011
* Version 1.1 in June 2011
* New application released in May 2011. Before this, the [[RateFeed]] was a collection of spreadsheets
<<tiddler RateFeedDashboard>>
<<tiddler ExportUtility>>
[[DataScope Configuration files|Configuration Files]] use [[RateFeed Formula]]e to define the calculation of rates. Only the curve definitions use the formulae actively, although the explicit formulae defined in the [[Configuration Files]] use them to define other functions.
!! Usage
Formulae are written in the [[Python]] language. However, they have access to the following functions (all defined in [[Formula.ini]]):
; curve(name, timeband, field)
: Retrieve the field for the given curve and timeband from the curve data. If the name, timeband or field are blank, use the current curve, timeband and field respectively.<br>@@background-color:#AAF;''Note'': You may add a fourth field to this call with a value - this will write the given value to the [[curves database]]. See the interpolation formula for a sample of its use.@@
; ditto()
: Apply the formula for the same field from the timeband before.<br>@@''Warning'': The timeband before is determined from the number of days in each timeband, not from their order in the configuration file.@@
; pool(field name, ric)
: Retrieve the field for the given [[RIC]] from the [[pool database]]. If the [[RIC]] is not provided, use the row [[RIC]]
----
; bond(name)
: Retrieve the yield from the given bond
; chain(list of fields)
: return the first non zero value from the fields (that is, pool("field name")), checked in order
; effectiveDatePlus(months, years)
: Calculate the actual date for timebands in a curve. This is the same day of the month the given number of years or months forward
; interpolate()
: Calculate the linear interpolation for the current field of the current timeband in the current curve. It uses the rates before and after based on the number of days for each time band. <br>@@background-color:#AAF;''Note'': This method schedules a recalculation so that it runs after all other formulae for all other cells have completed.@@ (A small sketch of the interpolation arithmetic follows this list.)
; spotDays()
: calculate the number of days after spot (avoiding weekends using modified following)
; syntheticYield()
: calculate the yield from the forward points and the USD yield using ACTUAL/360.
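The arithmetic behind interpolate() is worth seeing once. The sketch below is not the real plugin (which is an ''evaluateLater'' formula and interpolates on a 30/360 basis); it just shows linear interpolation on raw day counts with flat extrapolation at either end, given (days, rate) pairs where missing rates are None:
{{{
# Sketch of the arithmetic only - the real interpolate.py plugin is an evaluateLater formula
def interpolate_missing(points):
    """Fill None rates by linear interpolation on days, extrapolating flat at either end."""
    known = [(d, r) for d, r in points if r is not None]
    filled = []
    for d, r in points:
        if r is not None:
            filled.append((d, r))
            continue
        before = [p for p in known if p[0] < d]
        after = [p for p in known if p[0] > d]
        if not before:                            # nothing earlier: extrapolate flat off the start
            filled.append((d, after[0][1]))
        elif not after:                           # nothing later: extrapolate flat off the end
            filled.append((d, before[-1][1]))
        else:
            (d0, r0), (d1, r1) = before[-1], after[0]
            filled.append((d, r0 + (r1 - r0) * (d - d0) / (d1 - d0)))
    return filled
}}}
For example, {{{interpolate_missing([(2, 1.50), (30, None), (90, 1.80)])}}} fills the 30-day point at a little under 1.60.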
!!! Lesser known functions
The following functions are also available, but at present are only used in builtin functions:
; curveDetails(curveName)
: Return the currency of the given curve (or the current curve if not provided) in a dictionary
; findCurve(ty, ccy)
: Return the name of the curve that matches the type (one of "fx" or "yield") and currency provided
; fwdPrice(ccy, timeband)
: Return the current forward price (spot plus the forward points) for the given currency and timeband
; fwdYield(ccy, timeband)
: Return a tuple of the current forward yield and basis for the given currency and timeband
; setCurveDependency(dependant, dependsOn)
: Advise DataScopeRateFeed that every time the dependsOn curve updates, the dependant curve must be recalculated. <br>@@background-color:#AAF;''Note'': This is used for synthetic yield curves at present to flag that a synthetic curve needs to be rebuilt when either the USD Yield curve or the [[FX]] curve for the currency updates.@@
; timebands(curveName)
: Return a list of the timebands (in short codes, like 'sp', '1m', '4y') for the given curve (or the current curve if not provided)
!! Design Notes
[[RateFeed Formula]]e are basically [[Python]] code with the following exceptions:
* importing [[Python]] modules or opening files is forbidden
** The following classes are preimported: {{{ from datetime import datetime, date, timedelta, time}}}
* Formula class is imported. It has the following attributes:
** ~CalculationError() ⇒ raise this exception if you need to report an error in calculations
** ~PluginFunction ⇒ use this as a parent class if you need to do simple maths on an ''evaluateLater'' formula
** currentCurve ⇒ the name of the curve being calculated
** currentElement ⇒ the name of the current timeband or security being calculated
** currentField ⇒ the name of the current field being calculated
** currentRic ⇒ the [[RIC]] assigned to the ''currentElement''
* various functions are added (as described in ''Usage'' above)
** three of these (''bond'', ''pool'', ''curve'') have the special ability to assume quotes around string literals
*** For example:
**** curve(,,bid) ⇒ curve("","","bid") = return the bid rate for this timeband and this curve
**** bond(Inv Bond ~CGA1212) ⇒ bond("Inv Bond ~CGA1212") = return the rate for this bond
**** pool(trade date,~CADFSR1M=) ⇒ pool("trade date","~CADFSR1M=") = return the trade date for the Canadian LIBOR 1 month
*** This was a functional requirement to improve the editability of the configuration file, and has not been exhaustively tested
*** If it fails, or you need to use these functions with literals and variables, go back to using quotes
* The ''reset()'' method on the top of [[dsConfiguration.py]] defines where (internally) the built-in functions are located in the code
** If a built-in function has a ''module'' attribute (in its configuration), it will be implemented via the ''run()'' method in a [[Python]] module of that name.
!!! Formulae
The formulae in the effective date and curve elements must be one line long. If you need longer, create a function (in [[Formula.ini]]).
A formula must return a rate (or date), or an object that has an ''evaluateLater()'' method. These latter objects are evaluated after everything else in the curve has been calculated.
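Purely to show the shape of such an object (the exact base class and calling convention live in [[dsFormula.py]] and may well differ from this), an ''evaluateLater'' formula might look like:
{{{
# Hypothetical shape only - check dsFormula.py for the real base class and conventions
from dsFormula import Formula

class MidOfNeighbours(Formula.PluginFunction):
    """Returned by a formula now; evaluated once every other cell in the curve is filled."""
    def __init__(self):
        self.curveName = Formula.currentCurve
        self.timeband = Formula.currentElement
        self.field = Formula.currentField

    def evaluateLater(self):
        curve = Formula.GLOBALS['curve']          # the same curve() function the formulae use
        value = (curve(self.curveName, '1m', self.field) +
                 curve(self.curveName, '3m', self.field)) / 2
        curve(self.curveName, self.timeband, self.field, value)   # the four-argument form writes the result
}}}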
!!! Functions
Functions can be builtin or explicit (see [[Formulae in Configuration files|DataScope RateFeed/Configuration#Formulae]]). Once defined, these can be used in formulae or other functions.
If you experience a problem and cannot perform a calculation, {{{raise Formula.CalculationError('Message describing the error')}}} This will cause the message to be written to the log (along with the current curve, timeband and field) and the "Configuration" health bar to become "sick".
!!! Plugins
Plugins are [[Python]] modules that have a ''run()'' and ''reset()'' method. These should import the Formula class (from [[dsFormula.py]]) to access other functions and defaults.
; Some of the code in interpolate.py
{{{
"""
The Interpolator plug in is used by the DataScope RateFeed. It calculates a linear interpolation.
It uses 'evaluateLater' to allow the other timebands to be populated before this one kicks off.
"""
from dsFormula import Formula
def reset():
"""Called when imported"""
Interpolator.interpolators = {}
def run():
"""The entry point of this plugin. No arguments are taken; everything is implicit from the Formula settings"""
Formula.style = 'interpolated'
# This curve is self dependent - any change to this curve means that the interpolation must be redone
# even though the interpolation is on an evaluateLater call.
Formula.GLOBALS['setCurveDependency'](Formula.currentCurve,Formula.currentCurve)
# The evaluation object can be reused
try:
return Interpolator.interpolators[Formula.currentCurve][Formula.currentElement][Formula.currentField]
except:
return Interpolator()
}}}
!! Testing
* Formula changes should be tested on a [[Test Server]] because if there is a major error, DataScopeRateFeed will not start
** If you break this rule and get into trouble, you should be able to restore to the last working version of the ini file using [[Bazaar]].
* With a bit of care, most calculations could be tested in the "Interactive Window" of the [[PythonWin]] application
* As values from the [[pool database]] change during the day and during the month, it is perfectly reasonable to create one or more test curves in the [[Configuration Files]] with "~WriteToQuantum=No", and then take ad hoc exports to check which formula provides the best result.
!! References
* [[Configuration Files]]
* [[Formula.ini]]
~RateFeed is the name of the user that the RateFeed applications ran under in OakvaleCapital
!! References
Defn: a Windows network user name
~RateFeedCopy.xls was an old export file from the Excel based RateFeed. It was supported in the first couple of versions of the DataScopeRateFeed before being dropped. Some of the Excel generating code that supported it is still in dsExcel.py.
!! Reference
Defn: a historical artifact
[>img[http://farm8.staticflickr.com/7306/12628471685_e908452424.jpg][http://www.flickr.com/photos/64724523@N03/12628471685/]]
The RateFeedDashboard is a thin-client application running on the Operational Server via http://sydrms11/ratefeed . It provides the facility to remotely monitor and manage the DataScopeRateFeed.
!! Prerequisites
A modern browser (MSIE v7+, or Firefox v3+) and network access to the Operational Server.
!! Usage
To use the dashboard, navigate to http://sydrms11/ratefeed and click one of the menu buttons. Most buttons provide a popup area that shows some part of the functioning of the DataScopeRateFeed.
Lay out the popups the way you like and bookmark.
!!! RIC Popup
Choose this option to see all the fields captured from [[Reuters Datascope Select]] (or [[Manual RateFeed]]) for the given code. The display will refresh every 15 minutes
!!! Curve Popup
Choose this option to see a curve in the [[curves database]] (as built from the current [[pool database]]). It should reflect the current live curve in [[Quantum]].
!!! Queue
Choose this option to see the current job queue for the DataScopeRateFeed. This popup updates once a minute.
!!! Health
Choose this option to see the current health bars of the DataScopeRateFeed. This popup updates once a minute.
!!! Logs
Choose this option to see the log entries for the DataScopeRateFeed. You can filter the list by date, severity and job. As the log file can be long, it is displayed in 25 row lots. Use the export facility to see all entries for a day. The display will refresh every 5 minutes.
!!! Check ~DataScope
Click this button to instruct the DataScopeRateFeed to log on to [[Reuters Datascope Select]] within the next minute to look for an ad-hoc snap
!!! Refresh
Click this button to force an immediate refresh of all popups.
!! Configuration
To set up a RateFeed Dashboard on a new server
# Set up the server as a WAMP server
# Check {{{restful.py}}} and {{{rfdlib.py}}} out of {{{Applications/Python/Scripts/Conan/cgi-bin}}} in [[Bazaar]] into the Apache cgi-bin folder.
#* @@background-color:#AAF;''Note'': if you have ~Conan on the same server, this is done already for you@@
#* validate the settings of "SERVERS=" and "~CACHE_DB=" in {{{rfdlib.py}}}
# check out the {{{Quantum/RateFeed/Dashboard}}} from [[Bazaar]] into the "ratefeed" folder of "htdocs"
# Create a {{{d:\treasury\DashboardData}}} folder
!! Technical Notes
* The [[RateFeed Dashboard]] is a ~JavaScript application, but as MauriceManeschi's first implementation of such, is a bit ugly
** [[Backbone.js|http://documentcloud.github.com/backbone]] is used to manage the flow of control around the application
** {{{index.html}}} is the web page that launches the application. It is located in {{{d:\treasury\apache\htdocs\ratefeed}}} and has been committed into [[Bazaar]]
** {{{dashboard.js}}} contains the main code of the application. It is located in {{{d:\treasury\apache\htdocs\ratefeed\scripts}}} and has been committed into [[Bazaar]]
** The {{{scripts}}} folder also contains jQuery.js, backbone.js and underscore.js (as suggested by the backbone web site).
** The popup menu is located in {{{D:\treasury\apache\htdocs\ratefeed\AdvancedPageMenu}}}
* Although all information is sourced via the DataScopeRateFeed running on the production server, requests are directed to {{{restful.py}}}, which:
** communicates with the application via a restful interface
** caches responses to prevent overloading the DataScopeRateFeed (a minimal sketch of this caching follows these notes)
* {{{restful.py}}} is located in {{{d:\treasury\apache\cgi-bin}}} and committed to [[Bazaar]] under {{{Applications/Python/Scripts/Conan/cgi-bin}}}.
* The apache configuration file has been modified to convert ~URLs starting with "/rfd/" to be directed to {{{restful.py}}}
{{{
# RESTful interface for RateFeedDashboard
RewriteEngine on
RewriteRule ^/rfd/(.+) /cgi-bin/restful.py?p=$1 [PT]
}}}
* The cache is stored in a SQLite database located in {{{D:\treasury\DashboardData}}}.
** It is not required between invocations and will be recreated if deleted.
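As an illustration of that caching idea only (the function and table names here are hypothetical, not the ones in {{{restful.py}}}, and the RateFeed address is made up), the flow is roughly:
{{{
# Illustration only - names and the RateFeed address are hypothetical
import sqlite3, time, urllib.request

CACHE_DB = r"D:\treasury\DashboardData\cache.sqlite"
RATEFEED = "http://production-server:8080"        # hypothetical address of the DataScopeRateFeed interface
MAX_AGE = 60                                      # seconds a cached answer stays fresh

def fetch(path):
    """Answer a dashboard request from the cache if it is fresh, otherwise ask the RateFeed."""
    db = sqlite3.connect(CACHE_DB)
    db.execute("CREATE TABLE IF NOT EXISTS cache (path TEXT PRIMARY KEY, fetched REAL, body TEXT)")
    row = db.execute("SELECT fetched, body FROM cache WHERE path=?", (path,)).fetchone()
    if row and time.time() - row[0] < MAX_AGE:
        return row[1]
    body = urllib.request.urlopen(RATEFEED + "/" + path).read().decode("utf-8")
    db.execute("INSERT OR REPLACE INTO cache VALUES (?,?,?)", (path, time.time(), body))
    db.commit()
    return body
}}}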
!!! Limitations
* There is no security on the application, but this could be added if required
* The application does not look at QuantumRateFeed or RiskRateFeed, because these applications have very little business logic; they are largely driven off the work of the DataScopeRateFeed. If required, this dashboard could read the text log file on the Production server.
!! References
* DataScopeRateFeed
OakvaleCapital had a scheduled task on their production servers that produced a twice-daily [[export.xlsx|ExportUtility]]. It did so by running the ExportUtility with a "Curve" action.
The exports were placed in the treasury folder {{{f:\treasury\Rate Feed\Export}}} at the time.
!! Configuration
; In {{{k:\treasury\scripts}}}
{{{
REM Create a daily ratefeed export
@echo off
set ts=%date:~10,4%-%date:~7,2%-%date:~4,2%-%time:~0,2%-%time:~3,2%
"k:\treasury\rate feed\bin\ExportUtility.pyw" -o "k:\treasury\rate feed\Export" _
-l "k:\treasury\rate feed\Log Files\Export.log" -x "export%ts%.xlsx" Curve
}}}
The purpose of this BAT file is to time stamp the export file.
!!! Scheduling
|!Task |{{{K:\treasury\scripts\RateFeedExport.bat}}} |
|!Timing |10:40 and 16:45 every day |
|!Username |OAK\RateFeed (call for password) |
|!Flags |Run only when logged on |
|~|Start "Hidden" or "Iconised"|
|!Stop task after |15 minutes (looks like 1 hour is the minimum for Windows 2008) |
!! References
* ExportUtility
[>img[http://farm8.staticflickr.com/7452/12628183534_59f43cc3d7.jpg][http://www.flickr.com/photos/64724523@N03/12628183534/]]
The //RateFeed Health Bars// are a display on the [[Nagios Viewer]] that shows the DataScopeRateFeed health bars (for use in the [[Nagios Viewer]]'s slide show).
The display is implemented in [[Python]], which uses ~ImageMagick to draw the bars.
!! Configuration
[[RateFeedHealth.py]] is configured in the top of the script. It is scheduled to run every minute on the [[Nagios Viewer]] by cron.
!! Design Notes
* Checks which production server DataScopeRateFeed is running on.
** If it is running on neither, it calls up the last built image and puts the text "~RateFeed not responding" on it (a rough sketch of this step follows these notes)
** This means either the DataScopeRateFeed is down, or the WAN link between the Sydney office and the hosting centre is down. In the latter case, the [[RateFeed]] will keep running without us.
* Requests (via the RPC call "getHealth()") the health bars from the production DataScopeRateFeed
* Builds a display of the bars, using ~ImageMagick
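The fall-back step is the only fiddly part. A rough sketch (the image path is an assumption, and ~ImageMagick's {{{convert}}} must be on the path):
{{{
# Rough sketch of the fall-back only - the image path is an assumption
import subprocess

def stamp_not_responding(image="health.png"):
    """Overlay the 'not responding' banner on the last built health-bar image via ImageMagick."""
    subprocess.run([
        "convert", image,
        "-gravity", "South", "-pointsize", "28", "-fill", "red",
        "-annotate", "+0+10", "RateFeed not responding",
        image,
    ], check=True)
}}}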
!! References
* DataScopeRateFeed
* [[Nagios Viewer]]
Rates is a generic term for market information used to price financial instruments. In terms of the DataScopeRateFeed, it covers:
* Spot rates and forward points for foreign exchange
* Cash rates and swaps rates that describe interest rates
* Futures prices for commodities
* Interest Rate Option Volatility
* Foreign Exchange Option Volatility
* Commodity Option Volatility
* Bond coupons
!! Rate timings
Rates trade at different times around the world. Traditionally, financial instruments are priced against the [[End of Day]] rate, and the prices derived are used in accounting. OakvaleCapital notionally set an end of day for each class of rate depending on how their clients were using the rates. You will need to do the same (in the [[Configuration Files]]).
!! References
Defn: numbers from [[Reuters]] used to value financial instruments
Reuters is a distributor of [[Rates]]. In terms of the [[DataScopeRateFeed]], you need an account on [[Reuters Datascope Select]] to use this product.
!! References
* [[Reuters Datascope Select]]
Defn: a distributor of [[Rates]]
Reuters Datascope Select is an enterprise product (from [[Reuters]]) that allows customers to access [[Rates]].
The features used by the DataScopeRateFeed are:
* maintaining instrument lists
* scheduling rate snapshots
* downloading them via FTP
!! References
* [[Reuters]]
Defn: an enterprise product for retrieving [[Rates]]
AvantGard Risk is a pricing and risk modeling system distributed by SunGard
!! References
* SunGard
Defn: a pricing and risk modeling system distributed by SunGard
The Risk Replication Manager is a tool that is part of SunGard's AvantGard [[Risk]] product. It replicates static data, trades and rates from [[Quantum]] to [[Risk]].
RiskRateFeed calls the [[Risk Replication Manager]] several times a day to replicate [[End of Day]] curves to [[Risk]]. It also handles rates data which cannot be replicated.
The application name is AGRReplManager
!! References
* [[Risk]]
* RiskRateFeed
Defn: a tool to populate [[Risk]] with [[End of Day]] curves
[>img[http://farm4.staticflickr.com/3786/12601717813_e5d3a2ef51.jpg][http://flic.kr/p/kcz6EM]]
The RiskRateFeed is an application that runs under the [[RateFeed user]] on {{ctx02}} (or {{ctx01}} on switch over). It is started by [[RMS]] whenever {{app02}} is restarted and is left running. It manages the transfer, or replication, of rates from [[Quantum]] to [[Risk]] by:
* running the [[Risk Replication Manager]] to replicate all rates
* reading the [[Cross Currency basis margin]] from the intmarg table and writing them as "Yield Margins"
* reading the yield curves from intrates and writing them as a split [[Cash Curve]] and [[Swaps Curve]] in [[Risk]]
Rates in the last 7 days or since the last month end are replicated
@@background-color:#EEF;''Note'': [[Live Rates]] are replicated into [[Risk]], but are not used. (This was an OakvaleCapital decision, but it would generally hold that there is no point in stressing a portfolio based on anything other than [[End of Day]] rates.)@@
The application has been written in [[Python]] in a single file (RiskRateFeed.pyw). It is located in http://launchpad.net/dsratefeed.
The application checks that the configuration is correct when it is started. Thereafter, it sits in place until one of the scheduled times occur. It then carries out each step and then goes back to sleep. The success of the step is shown in the log window and is written to a file named {{{RiskRateFeed.status}}} (for consumption by [[Nagios]]). It writes greater detail into its log file (which it rotates as it gets full).
!! Configuration
The RiskRateFeed is configured by clicking the {{btn title{Configure}}} button on the window. This pops up a dialog. The values are stored in the Windows registry under {{{HKEY_CURRENT_USER\Software\Oakvale\RiskRateFeed}}} (a small sketch of reading them back follows the table below).
* Replication Schedule: 10:35, 16:45, 17:10
|! Parameter |! Meaning |
|Log folder |Where the log file and status file are written |
|Risk DCOM Session |The class name of the DCOM object that is used to write rates to [[Risk]]. It can be read from the visual basic definitions in {{{C:\Program Files (x86)\SunGard\AvantGard Risk\4.6\Import\RateFeed.xls}}} (where 4.6 is the current version of Risk) |
|Replication Manager Path |Location of the [[Risk Replication Manager]] executable. |
|Replication scheduled |A comma separated list of 24 hour times when the RiskRateFeed does its rate replication. |
|Quantum username |Needed when running the [[Risk Replication Manager]] in batch |
|Quantum password |~|
|Quantum catalog |~|
|SQL Server |Host name of the SQL Server used by [[Quantum]] |
|SQL Catalog |Database name of [[Quantum]] |
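To read one of these settings back from [[Python]] (a sketch only; the exact value names stored under the key are assumptions based on the table above):
{{{
# Sketch only - value names under the key are assumptions
import winreg

def risk_ratefeed_setting(name):
    """Read one RiskRateFeed setting from the registry key described above."""
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Software\Oakvale\RiskRateFeed") as key:
        value, _ = winreg.QueryValueEx(key, name)
        return value

# e.g. risk_ratefeed_setting("Log folder")
}}}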
!! Usage
In general, the RiskRateFeed manages the rates in [[Risk]] without human interaction. However, you can force rates across to [[Risk]], typically after they have been changed in [[Quantum]], by setting the date and clicking {{btn title{Replicate Now}}}. (The date defaults to the earliest of 7 days prior and last [[month end]], so as to pick up likely changes in [[Quantum]] as well as [[Income Security]] margins entered during the month.)
!! Design
The design goal of the RiskRateFeed is to use the [[Risk Replication Manager]] for everything. Unfortunately, the [[Risk Replication Manager]] cannot split the [[Quantum]] [[Yield Curves]] without destroying them (so that they do not replicate to [[Risk]]). Also, the [[Risk Replication Manager]] does not expect to find [[Yield Margins|Cross Currency basis margin]] in the [[Quantum]] intmarg table (despite its name).
The RiskRateFeed uses the code found in the {{{C:\Program Files (x86)\SunGard\AvantGard Risk\4.6\Import\RateFeed.xls}}} to insert the yield margins and [[Cash and Swaps Curve]]s correctly. It then calls the [[Risk Replication Manager]] for everything else (as described in Risk help on how to run the [[Risk Replication Manager]] from the command line).
@@background-color:#EEF;''Note'': The Risk DCOM object requires a logged on [[Quantum]] user to work. Thus the RiskRateFeed must be run from a [[RateFeed user]] desktop.@@
!!! ~RiskRateFeed.status
To allow [[Nagios]] to monitor the process, there is a status file in the log folder. It contains a single line which is either the word "OK" or a count of errors. The modified date of the file can be used to ensure the RiskRateFeed is running.
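A check in the spirit of what [[Nagios]] runs might look like this (a sketch only; the status file path assumes the log folder named above, and the freshness threshold is arbitrary):
{{{
# Minimal sketch of the kind of check Nagios could run - path and threshold are assumptions
import os, time

def check_riskratefeed(status_path=r"k:\treasury\rate feed\Log Files\RiskRateFeed.status",
                       max_age_hours=24):
    """Return a (state, message) pair in the spirit of a Nagios check."""
    with open(status_path) as f:
        status = f.read().strip()
    age_hours = (time.time() - os.path.getmtime(status_path)) / 3600
    if age_hours > max_age_hours:
        return "CRITICAL", "RiskRateFeed status file is %.0f hours old" % age_hours
    if status != "OK":
        return "CRITICAL", "RiskRateFeed reported %s errors" % status
    return "OK", "RiskRateFeed is healthy"
}}}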
!!! Log files
The RiskRateFeed writes its progress to {{{k:\treasury\rate feed\log files\RiskRateFeed.log}}}. (Older log files are kept with a numeric suffix. The RiskRateFeed manages the rotation of these.)
When the RiskRateFeed runs the [[Risk Replication Manager]] to replicate rates, it writes its progress to {{{k:\treasury\rate feed\log files\Replication Manager.log}}}. This only has the most current version. (It would not be hard to keep old versions, but as each replication overwrites the replication from before, I see no value in these.) Note that each time it runs, it returns several error messages "CS Swp Grp is not an available security instrument". This is because we enter [[Cross Currency basis margin]]s into the intmarg table and the [[Risk Replication Manager]] tries to replicate them as security margins, rather than as yield margins. Each of these errors can be safely ignored.
!! Workarounds
* The RiskRateFeed is a work around to the limitations of the [[Risk Replication Manager]].
* In [[Risk]] v4.06, the [[Risk Replication Manager]] does not terminate when it completes the replication. The RiskRateFeed has been modified to kill it in that event ({{TT|10036}})
* The configuration dialog has to be closed twice
* The first time the RiskRateFeed runs (after being restarted), it will fail to replicate the yield margins (with an "invalid parameter" error). Thereafter it runs fine. I have failed to find the reason during implementation, and it is hard to solve as it will not happen with the debugger on.
!! History
* The RiskRateFeed was created from a consolidation of ''~TransferRates.pyw'' and ''~RiskSpecialRateReplication.py'' when neither would work without modification for the upgrade. It was released early to production in April 2012 as it was able to easily resolve the replication of monthly and semi yield curves.
* In February 2011, the script was renamed to //~RiskSpecialRateReplication.py// to include the replication of [[Cross Currency basis margin]]s
* As part of the resiliency project, the script folder was relocated from the D:\ drive to the K:\ drive, where K is mapped to \\oakrmsmscs\iRMS\. Given the K drive is only mapped at interactive login, a batch job run as a scheduled task will not be able to access drive letter K; a workaround was implemented.
* There used to be an Excel Spreadsheet called AG Risk Ratefeed which split the last 5 days of Quantum [[Blended Yield Curve]]s into [[Cash and Swaps Curve]]s and inserted them into [[Risk]]. It used to hang and crash frequently, so was replaced with a script.
* See [[RateFeed]] History for further entries
!! References
* [[RateFeed]]
** DataScopeRateFeed
** QuantumRateFeed
* [[How to set up Risk RateFeed|Set up Risk RateFeed]]
The DataScopeRateFeed reads rates from the [[Reuters Datascope Select]] [[FTP]] site and converts them into curves in the [[curves database]]. In the event of the [[FTP]] site being unavailable for some reason, you can still get the files into DataScopeRateFeed via the "Test" mode functionality. Do the following:
# Stop the DataScopeRateFeed
# Edit "site.ini" and
## change the mode to "Test"
## Ensure that "Test Mode Input" and "Test Mode Processed" are valid paths. <br>@@''Warning'': If you are doing this because of network problems, ensure these paths are not on networks.@@
## Save and close
# Start the DataScopeRateFeed and ensure all the bars that could possibly be green are green
# Get the CSV files with rates from [[Reuters]] (e.g. perhaps via USB key)
# Put them into the "Test Mode Input" folder
# Watch the log over the [[RateFeed Dashboard]] to ensure rates are being parsed
When the [[FTP]] site becomes available again:
# Stop the DataScopeRateFeed
# Edit "site.ini" and
## change the mode to "Live"
## Save and close
# Start the DataScopeRateFeed and ensure all the bars that could possibly be green are green
''Note'': In test mode, DataScopeRateFeed ignores the following (a small filename-filter sketch follows the list):
* Files without the extension ".csv"
* Files with the extension ".ric.csv" (as these do not have rates)
* Files with the extension ".partial.csv" (as these have blank rates in embargoed instruments, and without the notes.txt, we do not know which ones are embargoed)
SQL stands for Structured Query Language. Within the context of the DataScopeRateFeed, it is how the [[pool database]] and [[curves database]] are read and updated.
!! References
Defn: the language to access the [[pool database]] and [[curves database]]
SQLite is a single user, relational database. DataScopeRateFeed (and its associated applications) use SQLite for their [[pool database]] and [[curves database]].
SQLite is built into [[Python]].
Note that Firefox uses SQLite for managing internal data too. There is a plugin called ~SQLite Manager that lets you manage SQLite databases. I used that frequently to directly query the [[pool database]] and [[curves database]].
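For ad hoc queries from [[Python]] itself, something like the following is enough (the column names of the ''currentRates'' table are assumptions; check the schema of your own [[pool database]] first):
{{{
# Illustrative only - the column names of the currentRates table are assumptions
import sqlite3

def fields_for_ric(ric, pool_path=r"k:\treasury\rate feed\pool.sqlite"):
    """Return {field: value} for one RIC from the pool database's currentRates table."""
    db = sqlite3.connect(pool_path)
    rows = db.execute("SELECT field, value FROM currentRates WHERE ric = ?", (ric,))
    return dict(rows.fetchall())
}}}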
!! References
* http://sqlite.org/
* [[pool database]]
* [[curves database]]
* [[Python]]
Defn: a single user, relational database
[>img[http://farm4.staticflickr.com/3747/12581848004_846db4dce9_n.jpg][http://flic.kr/p/kaPg4S]]
@@''Warning'': Once the DataScopeRateFeed and QuantumRateFeed are set up on a server, it should never need to be done again; you would generally restore the treasury folder in the event of a recovery situation.@@
!! Steps
To set up DataScopeRateFeed on a server:
# Log on as the [[RateFeed user]]
# Choose a folder to run from.
#* Generally this is {{{k:\treasury\rate feed}}}, but it can be anywhere
#* It may be on a networked folder
#* I suggest the folder not be across the WAN from where the DataScopeRateFeed application may run, as this would result in a lot of network traffic
# Put your [[Configuration Files]] into a subfolder
#* Typically, this is called {{{Configuration}}}
# Copy {{{s:\rms\tools & scripts\scripts\python\DataScopeRateFeed\bin}}} to {{{k:\treasury\rate feed\bin}}} (or a suitable alternative)
# Create an empty subfolder for downloaded files
#* Typically, this is called {{{k:\treasury\rate feed\Downloads}}}
#* This is needed, even if you do not plan to run DataScopeRateFeed in live mode
# Create an empty folder for log files
#* Typically, this is called {{{k:\treasury\rate feed\Log Files}}}
# Edit "Site.ini" in the configuration folder such that:
## __Mode ⇒ mode__ = the mode you wish to run under
## __Paths ⇒ Log File__ = the name of a log file in the log folder
##* Typically {{{k:\treasury\rate feed\Log Files\DataScope RateFeed.log}}}
## __Paths ⇒ Downloads__ = the path to the download folder
##* Typically {{{k:\treasury\rate feed\ratefeed downloads}}}
## __Paths ⇒ Pool data__ = the path for a [[pool database]]
##* Typically {{{k:\treasury\rate feed\pool.sqlite}}}
##* Note there need not be a [[pool database]] there. If not, DataScopeRateFeed will create it when it first runs
## __Paths ⇒ Curve data__ = the path for a [[curves database]]
##* Typically {{{k:\treasury\rate feed\curves.sqlite}}}
##* Note there need not be a [[curves database]] there. If not, DataScopeRateFeed will create it when it first runs
## __Paths ⇒ Manual RateFeed __ = the path to [[Manual Input.xlsx]]
##* Typically {{{\\oakrmsmscs.oakrms.local\iRMS\Treasury\Rate Feed\Manual\Manual Input.xlsx}}}
## Check ''Test mode'' and ''Clone mode'' below if appropriate
# Create a link on the desktop to {{{k:\treasury\rate feed\bin\DataScopeRateFeed.exe}}} in the "Datascope" subfolder
## If the configuration files are not in "K:\Treasury\Rate Feed\Configuration":
### Edit the Target path to use the configuration subfolder as a parameter (in double quotes): {{{"k:\treasury\rate feed\bin\DataScopeRateFeed.exe" "K:\some\funny\path\Configuration"}}}
!!! Test mode
If running in test mode, you need to set:
# __Mode ⇒ mode__ = Test
# __Test Mode Input__ = //path to where the files will be added//
# __Test Mode Processed__ = //path to where the files will be put after processing//
!!! Clone mode
If running in clone mode, you need to set:
# __Mode ⇒ mode__ = Clone
# __Clone ⇒ server__ = //the name of the server you want to clone from//
#* If there are two [[DataScopeRateFeeds|DataScopeRateFeed]] running on that server, you will need to set the __Clone ⇒ port__ to be different for each instance and then set the appropriate port in the clone
!! Considerations
* Live mode will check the [[Reuters Datascope Select]] [[FTP]] site for each snap. We should only have one DataScopeRateFeed in Live mode to avoid unnecessary traffic on this site, though I am sure we could occasionally have two - remember to shut off the second when done
* If you start in Live mode without a [[pool database]] (or with an old [[pool database]]), the DataScopeRateFeed will download every update in the last seven days. This will be a lot of traffic.
* In theory, one DataScopeRateFeed can have many clones hanging off of it. Clones can hang off clones too.
* If running in '''Live''' mode for several months, consider scheduling [[ArchiveRateFeedDownloads.py]] on the same server.
!! References
* DataScopeRateFeed
** [[DataScope RateFeed/Configuration]]
* [[How to set up Quantum RateFeed|Set up Quantum RateFeed]]
* [[How to set up Risk RateFeed|Set up Risk RateFeed]]
[>img[http://farm6.staticflickr.com/5537/12601385215_74b1cd4897.jpg][http://www.flickr.com/photos/64724523@N03/12601385215/]]
!! Steps
# Log on as the [[RateFeed user]]
# On the desktop, create a shortcut to {{{k:\treasury\rate feed\bin\QuantumRateFeed.exe}}}
# Run the application
#* You will probably get an error message advising that the configuration is faulty.
#** If so, click {{btn title{Retry}}} to bring up the configuration dialog
#** If not, click {{btn title{Configuration}}} to bring up the configuration dialog
# Set the:
## "Curve database" to the location of the [[curves database]];
##* Typically {{input|k:\treasury\rate feed\curves.sqlite}}
## "Log folder"" to the location to write log files;
##* Typically {{input|k:\treasury\rate feed\Log Files}}
## "Quantum username" is the user to insert the rates as;
##* Typically {{input|RateFeed}}
## "Quantum password" is the password for this user
## "Quantum catalog" is the database catalogue to insert to;
##* Typically {{input|Production}}
# Click {{btn title{OK}}}
#* The "Activity" should go to "Running"
!! Prerequisites
To run the QuantumRateFeed, the following must be installed:
* DataScopeRateFeed (on some server that can access the K drive for this server)
!! References
* QuantumRateFeed
* [[How to set up DataScope RateFeed|Set up DataScope RateFeed]]
* [[How to set up Risk RateFeed|Set up Risk RateFeed]]
[>img[http://farm8.staticflickr.com/7400/12602056804_9248b5fd57.jpg][http://flic.kr/p/kcAQrs]]
@@''Warning'': Once the DataScopeRateFeed, QuantumRateFeed and RiskRateFeed are set up on a server, it should never need to be done again; you would generally restore the treasury folder in the event of a recovery situation.@@
!! Steps
# Log on as the [[RateFeed user]] (both "LAN" and as a [[Quantum]] user)
# On the desktop, create a shortcut to {{{k:\treasury\rate feed\bin\RiskRateFeed.exe}}}
# Run the application
#* You will probably get an error message advising that the configuration is faulty.
#** If so, click {{btn title{Retry}}} to bring up the configuration dialog
#** If not, click {{btn title{Configuration}}} to bring up the configuration dialog
# Set the:
## "Log folder" to the location to write log files;
##* Typically {{input|k:\treasury\rate feed\Log Files}}
## "Risk DCOM Session" to the [[Risk]] DCOM object;
##* Found in the Visual Basic definitions of {{{C:\Program Files (x86)\SunGard\AvantGard Risk\4.6\Import\RateFeed.xls}}}
## "Replication Manager Path" to the location of the [[Risk Replication Manager]];
##* Typically {{input|d:\AvantGard Risk\v4.4\GUI\AGRReplManager.exe}}
## "Quantum username" is the user to insert the rates as;
##* Typically {{input|RateFeed}}
## "Quantum password" is the password for this user
## "Quantum catalog" is the database catalogue to insert to;
##* Typically {{input|Production}}
## "SQL Server" is the host name of the database server;
##* Typically {{input|oamrmssqlp}}
## "SQL Catalog" is the database catalogue to read from;
##* Typically {{input|QTProduction}}
# Click {{btn title{OK}}}
#* The "Status" should go to "Sleeping"
!! Prerequisites
To run the RiskRateFeed, the following must be running:
* [[Quantum]] Menu
!! References
* RiskRateFeed
* [[How to set up DataScope RateFeed|Set up DataScope RateFeed]]
* [[How to set up Quantum RateFeed|Set up Quantum RateFeed]]
Virtual pages in this site
This site is project documentation for the http://dsratefeed.launchpad.com site. It is a TiddlyWiki hosted on http://dsratefeed.tiddlyspot.com.
! Site map
* Application pages
** DataScopeRateFeed
** QuantumRateFeed
** RiskRateFeed
** ExportUtility
** RateFeedDashboard
* [[Categories]]
** [[Glossary]]
** HowTo
** [[Shadow]]
** [[Site]]
project documentation wiki
span.btn {
border: 2px outset gray;
padding: 0px 8px 1px 8px;
font: bold 10px sans-serif;
color: #000;
background-color:#eee;
vertical-align: 20%;
}
.glossary {
margin-left: 180px;
font-weight: normal;
font-size: 12px;
color: #000;
}
SunGard is a company that distributes AvantGard [[Quantum]] and AvantGard [[Risk]]
!! References
* http://www.sungard.com
Defn: a company that distributes AvantGard [[Quantum]] and AvantGard [[Risk]]
The Swaps Curve is the component of the [[Yield Curve]] that is quoted in [[Reuters Datascope Select]] on a swaps basis. You need to record the frequency of the swap in the [[Configuration Files]]. Oddly, the frequency is not a field in [[Reuters Datascope Select]], but it can often be derived from the [[RIC]] code.
!! References
Defn: a component of the [[Yield Curve]] that is quoted in [[Reuters Datascope Select]] on a swap (monthly, quarterly, semi-annual, annual) basis.
This article describes the technical implementation of the [[DataScope RateFeed]]
!! Data Flow
[>img[http://farm8.staticflickr.com/7426/12628724714_27d1f49b9d.jpg][http://www.flickr.com/photos/64724523@N03/12628724714/]]
The diagram opposite shows how rates start in a Reuters database (black "Res" disk), migrate to the [[DataScope]] [[FTP]] server (red [[DataScope]] box), are stored sequentially in a [[pool|pool database]] and [[curves database]] (green disks) and are populated into the [[Quantum]] Database (black "Q" disk). In particular:
# [[RMS]] have set up a number of scheduled snaps of Reuters' instruments on [[DataScope]]
#* Each snap generates a set of three or four files on the [[DataScope]] [[FTP]] server. The CSV file of each set has the rates data.
# According to the schedules of [[snap configurations in the configuration files|Configuration Files]], the DataScopeRateFeed pulls new files off the [[FTP]] server.
#* The non-CSV files are checked for errors or warnings pertaining to the rates
#* All files are downloaded into a "RateFeed downloads" folder (as defined in the [["Paths" section of the configuration file|Configuration Files]]).
#* Each snap has its own job in the job queue and a health bar showing how many usable rates it had
# Usable rates records are written to the [[pool database]]
#* The [[pool database]] keeps all rates, but also keeps a set of the latest good rate for each RIC
#* [[Manual RateFeed]] rates are all written to the [[pool database]] each time the [[Manual Input.xlsx]] is updated
# When rates update in the [[pool database]], the DataScopeRateFeed builds a list of curves that need to be updated
# These curves are built and written to the [[curves database]] by the "Curve Builder" job
#* If curves have errors during building, they are still written to the database, but are flagged as such
#* The [[curves database]] keeps a list of the more recent curves without errors
# The QuantumRateFeed uses the rates in the [[curves database]] to update [[Quantum]]. In particular:
#* when an error-free curve is written to the [[curves database]] it is written to [[Quantum]] as a live curve
#* when a pre-existing curve is flagged as [[End of Day]] in the [[curves database]], it is written as an [[End of Day (Rate type)]] curve to [[Quantum]].<br>@@background-color:#AAF;''Note'': Some curves are not written; see the ''writeToQuantum'' option in the [[curve definition section of the Configuration Files|Configuration Files]].@@
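The file-pulling step above is the only part of the flow that touches the outside world. As an illustration only (this is not the DataScopeRateFeed code; the host, credentials and folder names are placeholders), pulling new files off an [[FTP]] site into the downloads folder with Python's standard library looks roughly like this:
{{{
# Illustration of the "pull new files off the FTP server" step.
# Host, credentials and folder names are placeholders, not the real configuration.
import ftplib, os

DOWNLOADS = r'k:\treasury\rate feed\Downloads'   # the "RateFeed downloads" folder

def pull_new_files(host, user, password, remote_dir='.'):
    """Download every file in remote_dir that is not already in DOWNLOADS."""
    ftp = ftplib.FTP(host)
    ftp.login(user, password)
    ftp.cwd(remote_dir)
    already_have = set(os.listdir(DOWNLOADS))
    for name in ftp.nlst():
        if name in already_have:
            continue
        with open(os.path.join(DOWNLOADS, name), 'wb') as f:
            ftp.retrbinary('RETR ' + name, f.write)
    ftp.quit()

# pull_new_files('ftp.example.com', 'someuser', 'somepassword')
}}}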
!! Code
!!! Inline documentation
Use the ExportUtility to generate inline documentation
!! Workarounds
* dsDatascope.py has a regular expression "~NOT_AN_ERROR". Any error message that matches it will be ignored. This should make it easy to manage new versions of [[Reuters Datascope Select]].
** If such changes are frequent, we can move this into the configuration files
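In case it is not obvious how that filter behaves, the minimal sketch below shows the idea. The pattern shown is made up for illustration; the real expression lives in dsDatascope.py.
{{{
# Illustration only: how a NOT_AN_ERROR style filter typically behaves.
import re

# Made-up pattern; the real one is defined in dsDatascope.py
NOT_AN_ERROR = re.compile(r'Embargo delay|Field not applicable', re.IGNORECASE)

def real_errors(messages):
    """Keep only the error messages that do not match the ignore pattern."""
    return [m for m in messages if not NOT_AN_ERROR.search(m)]

print(real_errors(['Embargo delay on RIC AUD=', 'Permission denied']))
# -> ['Permission denied']
}}}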
!! References
* [[Design Notes]]
You test on the test server, not on the production server.
Put a full copy of the DataScopeRateFeed and its [[Configuration Files]] on the [[Test Server]]. Then change the Site.ini setting __Mode ⇒ mode__ to Clone and point it at the production server. From this point, every rate available to production is on this server, so you can test the [[Configuration Files]] to your heart's content.
!! References
Defn: a server that tests changes to the [[Configuration Files]]
TiddlyWiki is a single editor wiki written in ~JavaScript. This [[site|Site]] is a TiddlyWiki hosted on http://www.tiddlyspot.com. (You can host your own wiki there if you wish.)
!! References
Defn: the engine of this wiki
/***
Description: Contains the stuff you need to use Tiddlyspot
Note, you also need UploadPlugin, PasswordOptionPlugin and LoadRemoteFileThroughProxy
from http://tiddlywiki.bidix.info for a complete working Tiddlyspot site.
***/
//{{{
// edit this if you are migrating sites or retrofitting an existing TW
config.tiddlyspotSiteId = 'dsratefeed';
// make it so you can by default see edit controls via http
config.options.chkHttpReadOnly = false;
window.readOnly = false; // make sure of it (for tw 2.2)
window.showBackstage = true; // show backstage too
// disable autosave in d3
if (window.location.protocol != "file:")
config.options.chkGTDLazyAutoSave = false;
// tweak shadow tiddlers to add upload button, password entry box etc
with (config.shadowTiddlers) {
SiteUrl = 'http://'+config.tiddlyspotSiteId+'.tiddlyspot.com';
SideBarOptions = SideBarOptions.replace(/(<<saveChanges>>)/,"$1<<tiddler TspotSidebar>>");
OptionsPanel = OptionsPanel.replace(/^/,"<<tiddler TspotOptions>>");
DefaultTiddlers = DefaultTiddlers.replace(/^/,"[[WelcomeToTiddlyspot]] ");
MainMenu = MainMenu.replace(/^/,"[[WelcomeToTiddlyspot]] ");
}
// create some shadow tiddler content
merge(config.shadowTiddlers,{
'TspotControls':[
"| tiddlyspot password:|<<option pasUploadPassword>>|",
"| site management:|<<upload http://" + config.tiddlyspotSiteId + ".tiddlyspot.com/store.cgi index.html . . " + config.tiddlyspotSiteId + ">>//(requires tiddlyspot password)//<br>[[control panel|http://" + config.tiddlyspotSiteId + ".tiddlyspot.com/controlpanel]], [[download (go offline)|http://" + config.tiddlyspotSiteId + ".tiddlyspot.com/download]]|",
"| links:|[[tiddlyspot.com|http://tiddlyspot.com/]], [[FAQs|http://faq.tiddlyspot.com/]], [[blog|http://tiddlyspot.blogspot.com/]], email [[support|mailto:support@tiddlyspot.com]] & [[feedback|mailto:feedback@tiddlyspot.com]], [[donate|http://tiddlyspot.com/?page=donate]]|"
].join("\n"),
'TspotOptions':[
"tiddlyspot password:",
"<<option pasUploadPassword>>",
""
].join("\n"),
'TspotSidebar':[
"<<upload http://" + config.tiddlyspotSiteId + ".tiddlyspot.com/store.cgi index.html . . " + config.tiddlyspotSiteId + ">><html><a href='http://" + config.tiddlyspotSiteId + ".tiddlyspot.com/download' class='button'>download</a></html>"
].join("\n"),
'WelcomeToTiddlyspot':[
"This document is a ~TiddlyWiki from tiddlyspot.com. A ~TiddlyWiki is an electronic notebook that is great for managing todo lists, personal information, and all sorts of things.",
"",
"@@font-weight:bold;font-size:1.3em;color:#444; //What now?// @@ Before you can save any changes, you need to enter your password in the form below. Then configure privacy and other site settings at your [[control panel|http://" + config.tiddlyspotSiteId + ".tiddlyspot.com/controlpanel]] (your control panel username is //" + config.tiddlyspotSiteId + "//).",
"<<tiddler TspotControls>>",
"See also GettingStarted.",
"",
"@@font-weight:bold;font-size:1.3em;color:#444; //Working online// @@ You can edit this ~TiddlyWiki right now, and save your changes using the \"save to web\" button in the column on the right.",
"",
"@@font-weight:bold;font-size:1.3em;color:#444; //Working offline// @@ A fully functioning copy of this ~TiddlyWiki can be saved onto your hard drive or USB stick. You can make changes and save them locally without being connected to the Internet. When you're ready to sync up again, just click \"upload\" and your ~TiddlyWiki will be saved back to tiddlyspot.com.",
"",
"@@font-weight:bold;font-size:1.3em;color:#444; //Help!// @@ Find out more about ~TiddlyWiki at [[TiddlyWiki.com|http://tiddlywiki.com]]. Also visit [[TiddlyWiki.org|http://tiddlywiki.org]] for documentation on learning and using ~TiddlyWiki. New users are especially welcome on the [[TiddlyWiki mailing list|http://groups.google.com/group/TiddlyWiki]], which is an excellent place to ask questions and get help. If you have a tiddlyspot related problem email [[tiddlyspot support|mailto:support@tiddlyspot.com]].",
"",
"@@font-weight:bold;font-size:1.3em;color:#444; //Enjoy :)// @@ We hope you like using your tiddlyspot.com site. Please email [[feedback@tiddlyspot.com|mailto:feedback@tiddlyspot.com]] with any comments or suggestions."
].join("\n")
});
//}}}
| !date | !user | !location | !storeUrl | !uploadDir | !toFilename | !backupdir | !origin |
| 18/02/2014 14:29:29 | MauriceManeschi | [[/|http://dsratefeed.tiddlyspot.com/]] | [[store.cgi|http://dsratefeed.tiddlyspot.com/store.cgi]] | . | [[index.html | http://dsratefeed.tiddlyspot.com/index.html]] | . |
| 19/02/2014 15:20:12 | MauriceManeschi | [[/|http://dsratefeed.tiddlyspot.com/]] | [[store.cgi|http://dsratefeed.tiddlyspot.com/store.cgi]] | . | [[index.html | http://dsratefeed.tiddlyspot.com/index.html]] | . | ok |
| 19/02/2014 15:38:47 | MauriceManeschi | [[/|http://dsratefeed.tiddlyspot.com/]] | [[store.cgi|http://dsratefeed.tiddlyspot.com/store.cgi]] | . | [[index.html | http://dsratefeed.tiddlyspot.com/index.html]] | . | ok |
| 19/02/2014 16:19:34 | MauriceManeschi | [[/|http://dsratefeed.tiddlyspot.com/]] | [[store.cgi|http://dsratefeed.tiddlyspot.com/store.cgi]] | . | [[index.html | http://dsratefeed.tiddlyspot.com/index.html]] | . | ok |
| 19/02/2014 16:37:25 | MauriceManeschi | [[/|http://dsratefeed.tiddlyspot.com/]] | [[store.cgi|http://dsratefeed.tiddlyspot.com/store.cgi]] | . | [[index.html | http://dsratefeed.tiddlyspot.com/index.html]] | . | ok |
| 19/02/2014 16:53:37 | MauriceManeschi | [[/|http://dsratefeed.tiddlyspot.com/]] | [[store.cgi|http://dsratefeed.tiddlyspot.com/store.cgi]] | . | [[index.html | http://dsratefeed.tiddlyspot.com/index.html]] | . |
| 20/02/2014 08:12:58 | MauriceManeschi | [[/|http://dsratefeed.tiddlyspot.com/]] | [[store.cgi|http://dsratefeed.tiddlyspot.com/store.cgi]] | . | [[index.html | http://dsratefeed.tiddlyspot.com/index.html]] | . |
| 20/02/2014 08:49:05 | MauriceManeschi | [[/|http://dsratefeed.tiddlyspot.com/]] | [[store.cgi|http://dsratefeed.tiddlyspot.com/store.cgi]] | . | [[index.html | http://dsratefeed.tiddlyspot.com/index.html]] | . | ok |
| 20/02/2014 08:52:51 | MauriceManeschi | [[/|http://dsratefeed.tiddlyspot.com/]] | [[store.cgi|http://dsratefeed.tiddlyspot.com/store.cgi]] | . | [[index.html | http://dsratefeed.tiddlyspot.com/index.html]] | . | ok |
| 20/02/2014 08:53:55 | MauriceManeschi | [[/|http://dsratefeed.tiddlyspot.com/]] | [[store.cgi|http://dsratefeed.tiddlyspot.com/store.cgi]] | . | [[index.html | http://dsratefeed.tiddlyspot.com/index.html]] | . |
/***
|''Name:''|UploadPlugin|
|''Description:''|Save to web a TiddlyWiki|
|''Version:''|4.1.3|
|''Date:''|Feb 24, 2008|
|''Source:''|http://tiddlywiki.bidix.info/#UploadPlugin|
|''Documentation:''|http://tiddlywiki.bidix.info/#UploadPluginDoc|
|''Author:''|BidiX (BidiX (at) bidix (dot) info)|
|''License:''|[[BSD open source license|http://tiddlywiki.bidix.info/#%5B%5BBSD%20open%20source%20license%5D%5D ]]|
|''~CoreVersion:''|2.2.0|
|''Requires:''|PasswordOptionPlugin|
***/
//{{{
version.extensions.UploadPlugin = {
major: 4, minor: 1, revision: 3,
date: new Date("Feb 24, 2008"),
source: 'http://tiddlywiki.bidix.info/#UploadPlugin',
author: 'BidiX (BidiX (at) bidix (dot) info',
coreVersion: '2.2.0'
};
//
// Environment
//
if (!window.bidix) window.bidix = {}; // bidix namespace
bidix.debugMode = false; // true to activate both in Plugin and UploadService
//
// Upload Macro
//
config.macros.upload = {
// default values
defaultBackupDir: '', //no backup
defaultStoreScript: "store.php",
defaultToFilename: "index.html",
defaultUploadDir: ".",
authenticateUser: true // UploadService Authenticate User
};
config.macros.upload.label = {
promptOption: "Save and Upload this TiddlyWiki with UploadOptions",
promptParamMacro: "Save and Upload this TiddlyWiki in %0",
saveLabel: "save to web",
saveToDisk: "save to disk",
uploadLabel: "upload"
};
config.macros.upload.messages = {
noStoreUrl: "No store URL in parmeters or options",
usernameOrPasswordMissing: "Username or password missing"
};
config.macros.upload.handler = function(place,macroName,params) {
if (readOnly)
return;
var label;
if (document.location.toString().substr(0,4) == "http")
label = this.label.saveLabel;
else
label = this.label.uploadLabel;
var prompt;
if (params[0]) {
prompt = this.label.promptParamMacro.toString().format([this.destFile(params[0],
(params[1] ? params[1]:bidix.basename(window.location.toString())), params[3])]);
} else {
prompt = this.label.promptOption;
}
createTiddlyButton(place, label, prompt, function() {config.macros.upload.action(params);}, null, null, this.accessKey);
};
config.macros.upload.action = function(params)
{
// for missing macro parameter set value from options
if (!params) params = {};
var storeUrl = params[0] ? params[0] : config.options.txtUploadStoreUrl;
var toFilename = params[1] ? params[1] : config.options.txtUploadFilename;
var backupDir = params[2] ? params[2] : config.options.txtUploadBackupDir;
var uploadDir = params[3] ? params[3] : config.options.txtUploadDir;
var username = params[4] ? params[4] : config.options.txtUploadUserName;
var password = config.options.pasUploadPassword; // for security reason no password as macro parameter
// for still missing parameter set default value
if ((!storeUrl) && (document.location.toString().substr(0,4) == "http"))
storeUrl = bidix.dirname(document.location.toString())+'/'+config.macros.upload.defaultStoreScript;
if (storeUrl.substr(0,4) != "http")
storeUrl = bidix.dirname(document.location.toString()) +'/'+ storeUrl;
if (!toFilename)
toFilename = bidix.basename(window.location.toString());
if (!toFilename)
toFilename = config.macros.upload.defaultToFilename;
if (!uploadDir)
uploadDir = config.macros.upload.defaultUploadDir;
if (!backupDir)
backupDir = config.macros.upload.defaultBackupDir;
// report error if still missing
if (!storeUrl) {
alert(config.macros.upload.messages.noStoreUrl);
clearMessage();
return false;
}
if (config.macros.upload.authenticateUser && (!username || !password)) {
alert(config.macros.upload.messages.usernameOrPasswordMissing);
clearMessage();
return false;
}
bidix.upload.uploadChanges(false,null,storeUrl, toFilename, uploadDir, backupDir, username, password);
return false;
};
config.macros.upload.destFile = function(storeUrl, toFilename, uploadDir)
{
if (!storeUrl)
return null;
var dest = bidix.dirname(storeUrl);
if (uploadDir && uploadDir != '.')
dest = dest + '/' + uploadDir;
dest = dest + '/' + toFilename;
return dest;
};
//
// uploadOptions Macro
//
config.macros.uploadOptions = {
handler: function(place,macroName,params) {
var wizard = new Wizard();
wizard.createWizard(place,this.wizardTitle);
wizard.addStep(this.step1Title,this.step1Html);
var markList = wizard.getElement("markList");
var listWrapper = document.createElement("div");
markList.parentNode.insertBefore(listWrapper,markList);
wizard.setValue("listWrapper",listWrapper);
this.refreshOptions(listWrapper,false);
var uploadCaption;
if (document.location.toString().substr(0,4) == "http")
uploadCaption = config.macros.upload.label.saveLabel;
else
uploadCaption = config.macros.upload.label.uploadLabel;
wizard.setButtons([
{caption: uploadCaption, tooltip: config.macros.upload.label.promptOption,
onClick: config.macros.upload.action},
{caption: this.cancelButton, tooltip: this.cancelButtonPrompt, onClick: this.onCancel}
]);
},
options: [
"txtUploadUserName",
"pasUploadPassword",
"txtUploadStoreUrl",
"txtUploadDir",
"txtUploadFilename",
"txtUploadBackupDir",
"chkUploadLog",
"txtUploadLogMaxLine"
],
refreshOptions: function(listWrapper) {
var opts = [];
for(i=0; i<this.options.length; i++) {
var opt = {};
opts.push();
opt.option = "";
n = this.options[i];
opt.name = n;
opt.lowlight = !config.optionsDesc[n];
opt.description = opt.lowlight ? this.unknownDescription : config.optionsDesc[n];
opts.push(opt);
}
var listview = ListView.create(listWrapper,opts,this.listViewTemplate);
for(n=0; n<opts.length; n++) {
var type = opts[n].name.substr(0,3);
var h = config.macros.option.types[type];
if (h && h.create) {
h.create(opts[n].colElements['option'],type,opts[n].name,opts[n].name,"no");
}
}
},
onCancel: function(e)
{
backstage.switchTab(null);
return false;
},
wizardTitle: "Upload with options",
step1Title: "These options are saved in cookies in your browser",
step1Html: "<input type='hidden' name='markList'></input><br>",
cancelButton: "Cancel",
cancelButtonPrompt: "Cancel prompt",
listViewTemplate: {
columns: [
{name: 'Description', field: 'description', title: "Description", type: 'WikiText'},
{name: 'Option', field: 'option', title: "Option", type: 'String'},
{name: 'Name', field: 'name', title: "Name", type: 'String'}
],
rowClasses: [
{className: 'lowlight', field: 'lowlight'}
]}
};
//
// upload functions
//
if (!bidix.upload) bidix.upload = {};
if (!bidix.upload.messages) bidix.upload.messages = {
//from saving
invalidFileError: "The original file '%0' does not appear to be a valid TiddlyWiki",
backupSaved: "Backup saved",
backupFailed: "Failed to upload backup file",
rssSaved: "RSS feed uploaded",
rssFailed: "Failed to upload RSS feed file",
emptySaved: "Empty template uploaded",
emptyFailed: "Failed to upload empty template file",
mainSaved: "Main TiddlyWiki file uploaded",
mainFailed: "Failed to upload main TiddlyWiki file. Your changes have not been saved",
//specific upload
loadOriginalHttpPostError: "Can't get original file",
aboutToSaveOnHttpPost: 'About to upload on %0 ...',
storePhpNotFound: "The store script '%0' was not found."
};
bidix.upload.uploadChanges = function(onlyIfDirty,tiddlers,storeUrl,toFilename,uploadDir,backupDir,username,password)
{
var callback = function(status,uploadParams,original,url,xhr) {
if (!status) {
displayMessage(bidix.upload.messages.loadOriginalHttpPostError);
return;
}
if (bidix.debugMode)
alert(original.substr(0,500)+"\n...");
// Locate the storeArea div's
var posDiv = locateStoreArea(original);
if((posDiv[0] == -1) || (posDiv[1] == -1)) {
alert(config.messages.invalidFileError.format([localPath]));
return;
}
bidix.upload.uploadRss(uploadParams,original,posDiv);
};
if(onlyIfDirty && !store.isDirty())
return;
clearMessage();
// save on localdisk ?
if (document.location.toString().substr(0,4) == "file") {
var path = document.location.toString();
var localPath = getLocalPath(path);
saveChanges();
}
// get original
var uploadParams = new Array(storeUrl,toFilename,uploadDir,backupDir,username,password);
var originalPath = document.location.toString();
// If url is a directory : add index.html
if (originalPath.charAt(originalPath.length-1) == "/")
originalPath = originalPath + "index.html";
var dest = config.macros.upload.destFile(storeUrl,toFilename,uploadDir);
var log = new bidix.UploadLog();
log.startUpload(storeUrl, dest, uploadDir, backupDir);
displayMessage(bidix.upload.messages.aboutToSaveOnHttpPost.format([dest]));
if (bidix.debugMode)
alert("about to execute Http - GET on "+originalPath);
var r = doHttp("GET",originalPath,null,null,username,password,callback,uploadParams,null);
if (typeof r == "string")
displayMessage(r);
return r;
};
bidix.upload.uploadRss = function(uploadParams,original,posDiv)
{
var callback = function(status,params,responseText,url,xhr) {
if(status) {
var destfile = responseText.substring(responseText.indexOf("destfile:")+9,responseText.indexOf("\n", responseText.indexOf("destfile:")));
displayMessage(bidix.upload.messages.rssSaved,bidix.dirname(url)+'/'+destfile);
bidix.upload.uploadMain(params[0],params[1],params[2]);
} else {
displayMessage(bidix.upload.messages.rssFailed);
}
};
// do uploadRss
if(config.options.chkGenerateAnRssFeed) {
var rssPath = uploadParams[1].substr(0,uploadParams[1].lastIndexOf(".")) + ".xml";
var rssUploadParams = new Array(uploadParams[0],rssPath,uploadParams[2],'',uploadParams[4],uploadParams[5]);
var rssString = generateRss();
// no UnicodeToUTF8 conversion needed when location is "file" !!!
if (document.location.toString().substr(0,4) != "file")
rssString = convertUnicodeToUTF8(rssString);
bidix.upload.httpUpload(rssUploadParams,rssString,callback,Array(uploadParams,original,posDiv));
} else {
bidix.upload.uploadMain(uploadParams,original,posDiv);
}
};
bidix.upload.uploadMain = function(uploadParams,original,posDiv)
{
var callback = function(status,params,responseText,url,xhr) {
var log = new bidix.UploadLog();
if(status) {
// if backupDir specified
if ((params[3]) && (responseText.indexOf("backupfile:") > -1)) {
var backupfile = responseText.substring(responseText.indexOf("backupfile:")+11,responseText.indexOf("\n", responseText.indexOf("backupfile:")));
displayMessage(bidix.upload.messages.backupSaved,bidix.dirname(url)+'/'+backupfile);
}
var destfile = responseText.substring(responseText.indexOf("destfile:")+9,responseText.indexOf("\n", responseText.indexOf("destfile:")));
displayMessage(bidix.upload.messages.mainSaved,bidix.dirname(url)+'/'+destfile);
store.setDirty(false);
log.endUpload("ok");
} else {
alert(bidix.upload.messages.mainFailed);
displayMessage(bidix.upload.messages.mainFailed);
log.endUpload("failed");
}
};
// do uploadMain
var revised = bidix.upload.updateOriginal(original,posDiv);
bidix.upload.httpUpload(uploadParams,revised,callback,uploadParams);
};
bidix.upload.httpUpload = function(uploadParams,data,callback,params)
{
var localCallback = function(status,params,responseText,url,xhr) {
url = (url.indexOf("nocache=") < 0 ? url : url.substring(0,url.indexOf("nocache=")-1));
if (xhr.status == 404)
alert(bidix.upload.messages.storePhpNotFound.format([url]));
if ((bidix.debugMode) || (responseText.indexOf("Debug mode") >= 0 )) {
alert(responseText);
if (responseText.indexOf("Debug mode") >= 0 )
responseText = responseText.substring(responseText.indexOf("\n\n")+2);
} else if (responseText.charAt(0) != '0')
alert(responseText);
if (responseText.charAt(0) != '0')
status = null;
callback(status,params,responseText,url,xhr);
};
// do httpUpload
var boundary = "---------------------------"+"AaB03x";
var uploadFormName = "UploadPlugin";
// compose headers data
var sheader = "";
sheader += "--" + boundary + "\r\nContent-disposition: form-data; name=\"";
sheader += uploadFormName +"\"\r\n\r\n";
sheader += "backupDir="+uploadParams[3] +
";user=" + uploadParams[4] +
";password=" + uploadParams[5] +
";uploaddir=" + uploadParams[2];
if (bidix.debugMode)
sheader += ";debug=1";
sheader += ";;\r\n";
sheader += "\r\n" + "--" + boundary + "\r\n";
sheader += "Content-disposition: form-data; name=\"userfile\"; filename=\""+uploadParams[1]+"\"\r\n";
sheader += "Content-Type: text/html;charset=UTF-8" + "\r\n";
sheader += "Content-Length: " + data.length + "\r\n\r\n";
// compose trailer data
var strailer = new String();
strailer = "\r\n--" + boundary + "--\r\n";
data = sheader + data + strailer;
if (bidix.debugMode) alert("about to execute Http - POST on "+uploadParams[0]+"\n with \n"+data.substr(0,500)+ " ... ");
var r = doHttp("POST",uploadParams[0],data,"multipart/form-data; ;charset=UTF-8; boundary="+boundary,uploadParams[4],uploadParams[5],localCallback,params,null);
if (typeof r == "string")
displayMessage(r);
return r;
};
// same as Saving's updateOriginal but without convertUnicodeToUTF8 calls
bidix.upload.updateOriginal = function(original, posDiv)
{
if (!posDiv)
posDiv = locateStoreArea(original);
if((posDiv[0] == -1) || (posDiv[1] == -1)) {
alert(config.messages.invalidFileError.format([localPath]));
return;
}
var revised = original.substr(0,posDiv[0] + startSaveArea.length) + "\n" +
store.allTiddlersAsHtml() + "\n" +
original.substr(posDiv[1]);
var newSiteTitle = getPageTitle().htmlEncode();
revised = revised.replaceChunk("<title"+">","</title"+">"," " + newSiteTitle + " ");
revised = updateMarkupBlock(revised,"PRE-HEAD","MarkupPreHead");
revised = updateMarkupBlock(revised,"POST-HEAD","MarkupPostHead");
revised = updateMarkupBlock(revised,"PRE-BODY","MarkupPreBody");
revised = updateMarkupBlock(revised,"POST-SCRIPT","MarkupPostBody");
return revised;
};
//
// UploadLog
//
// config.options.chkUploadLog :
// false : no logging
// true : logging
// config.options.txtUploadLogMaxLine :
// -1 : no limit
// 0 : no Log lines but UploadLog is still in place
// n : the last n lines are only kept
// NaN : no limit (-1)
bidix.UploadLog = function() {
if (!config.options.chkUploadLog)
return; // this.tiddler = null
this.tiddler = store.getTiddler("UploadLog");
if (!this.tiddler) {
this.tiddler = new Tiddler();
this.tiddler.title = "UploadLog";
this.tiddler.text = "| !date | !user | !location | !storeUrl | !uploadDir | !toFilename | !backupdir | !origin |";
this.tiddler.created = new Date();
this.tiddler.modifier = config.options.txtUserName;
this.tiddler.modified = new Date();
store.addTiddler(this.tiddler);
}
return this;
};
bidix.UploadLog.prototype.addText = function(text) {
if (!this.tiddler)
return;
// retrieve maxLine when we need it
var maxLine = parseInt(config.options.txtUploadLogMaxLine,10);
if (isNaN(maxLine))
maxLine = -1;
// add text
if (maxLine != 0)
this.tiddler.text = this.tiddler.text + text;
// Trunck to maxLine
if (maxLine >= 0) {
var textArray = this.tiddler.text.split('\n');
if (textArray.length > maxLine + 1)
textArray.splice(1,textArray.length-1-maxLine);
this.tiddler.text = textArray.join('\n');
}
// update tiddler fields
this.tiddler.modifier = config.options.txtUserName;
this.tiddler.modified = new Date();
store.addTiddler(this.tiddler);
// refresh and notifiy for immediate update
story.refreshTiddler(this.tiddler.title);
store.notify(this.tiddler.title, true);
};
bidix.UploadLog.prototype.startUpload = function(storeUrl, toFilename, uploadDir, backupDir) {
if (!this.tiddler)
return;
var now = new Date();
var text = "\n| ";
var filename = bidix.basename(document.location.toString());
if (!filename) filename = '/';
text += now.formatString("0DD/0MM/YYYY 0hh:0mm:0ss") +" | ";
text += config.options.txtUserName + " | ";
text += "[["+filename+"|"+location + "]] |";
text += " [[" + bidix.basename(storeUrl) + "|" + storeUrl + "]] | ";
text += uploadDir + " | ";
text += "[[" + bidix.basename(toFilename) + " | " +toFilename + "]] | ";
text += backupDir + " |";
this.addText(text);
};
bidix.UploadLog.prototype.endUpload = function(status) {
if (!this.tiddler)
return;
this.addText(" "+status+" |");
};
//
// Utilities
//
bidix.checkPlugin = function(plugin, major, minor, revision) {
var ext = version.extensions[plugin];
if (!
(ext &&
((ext.major > major) ||
((ext.major == major) && (ext.minor > minor)) ||
((ext.major == major) && (ext.minor == minor) && (ext.revision >= revision))))) {
// write error in PluginManager
if (pluginInfo)
pluginInfo.log.push("Requires " + plugin + " " + major + "." + minor + "." + revision);
eval(plugin); // generate an error : "Error: ReferenceError: xxxx is not defined"
}
};
bidix.dirname = function(filePath) {
if (!filePath)
return;
var lastpos;
if ((lastpos = filePath.lastIndexOf("/")) != -1) {
return filePath.substring(0, lastpos);
} else {
return filePath.substring(0, filePath.lastIndexOf("\\"));
}
};
bidix.basename = function(filePath) {
if (!filePath)
return;
var lastpos;
if ((lastpos = filePath.lastIndexOf("#")) != -1)
filePath = filePath.substring(0, lastpos);
if ((lastpos = filePath.lastIndexOf("/")) != -1) {
return filePath.substring(lastpos + 1);
} else
return filePath.substring(filePath.lastIndexOf("\\")+1);
};
bidix.initOption = function(name,value) {
if (!config.options[name])
config.options[name] = value;
};
//
// Initializations
//
// require PasswordOptionPlugin 1.0.1 or better
bidix.checkPlugin("PasswordOptionPlugin", 1, 0, 1);
// styleSheet
setStylesheet('.txtUploadStoreUrl, .txtUploadBackupDir, .txtUploadDir {width: 22em;}',"uploadPluginStyles");
//optionsDesc
merge(config.optionsDesc,{
txtUploadStoreUrl: "Url of the UploadService script (default: store.php)",
txtUploadFilename: "Filename of the uploaded file (default: in index.html)",
txtUploadDir: "Relative Directory where to store the file (default: . (downloadService directory))",
txtUploadBackupDir: "Relative Directory where to backup the file. If empty no backup. (default: ''(empty))",
txtUploadUserName: "Upload Username",
pasUploadPassword: "Upload Password",
chkUploadLog: "do Logging in UploadLog (default: true)",
txtUploadLogMaxLine: "Maximum of lines in UploadLog (default: 10)"
});
// Options Initializations
bidix.initOption('txtUploadStoreUrl','');
bidix.initOption('txtUploadFilename','');
bidix.initOption('txtUploadDir','');
bidix.initOption('txtUploadBackupDir','');
bidix.initOption('txtUploadUserName','');
bidix.initOption('pasUploadPassword','');
bidix.initOption('chkUploadLog',true);
bidix.initOption('txtUploadLogMaxLine','10');
// Backstage
merge(config.tasks,{
uploadOptions: {text: "upload", tooltip: "Change UploadOptions and Upload", content: '<<uploadOptions>>'}
});
config.backstageTasks.push("uploadOptions");
//}}}
A Yield Curve is a series of bid and offer rates, each at a different time band, that indicates the theoretic borrowing or lending rate for a currency (or set of instruments within a currency).
OakvaleCapital constructed Yield Curves with a [[Cash Curve]] and a [[Swaps Curve]]. (Other institutions used a Futures Curve as well.) The [[Cash Curve]] generally run from Overnight to 6 Months with one payment at maturity. The [[Swaps Curve]] generally runs from 1 Year until the end of the curve with payments at the shown frequencies. For some currencies (e.g. GBP), the Cash Curve includes a 1 Year time band and the Swaps Curve starts from 2 Years.
!! References
Defn: A series of bid and offers rates that indicate the theoretic borrowing or lending rate for a currency
<<tiddler [[Yield Curve]]>>
The [[curves database]] is an SQLite database used by the DataScopeRateFeed and the QuantumRateFeed. Its production copy is located in the {{{Rate Feed}}} subfolder of the treasury folder ({{{\\oakrmsmscs.oakrms.local\iRMS\Treasury\Rate Feed\curves.sqlite}}}). It is backed up as part of the hosting services provided by [[SunGard]].
The file is quite large, typically 200 MB, as it contains every curve built in the last 90 days (or as configured in the [[DataScope RateFeed configuration files|DataScope RateFeed/Configuration]]).
!! Structure
!!! test
The '''test''' table is used by DataScopeRateFeed to confirm it has write access to the [[curves database]]
----
!!! log
The '''log''' table contains an entry for each log message from the DataScopeRateFeed. As most of these messages are repetitive, the text is kept in the '''logMessages''' table. Also, to keep the file size from getting too big, the ''msgDt'' is the number of seconds since 1/1/2000.
* ''status'' is
*: '''e''' ⇒ Error
*: '''w''' ⇒ Warning
*: '''i''' ⇒ Information
** When "information", the ''level'' indicates the verbosity. Generally, you do not want to see messages below level 1.
!!! logMessages
The '''logMessages''' table contains the text of the messages written to the ''log'' table above. These messages are often repetitive, so each distinct message (less than one line) has one instance in this table and is referenced many times from ''log''.
!!! jobNames
The '''jobNames''' table contains the names of the jobs which write to the ''log'' table. As there are only a few, it greatly reduces table sizes and speeds up queries.
----
!!! curves
The '''curves''' table links the curves names (as defined in the [[DataScope RateFeed/Configuration]]) with the actual curves built.
* The ''in_use'' field indicates previously defined curves that have been removed (in the last 90 days) from the configuration.
* The ''histCurveId'' indicates the current "live" curve (last built without errors)
!!! qtomFields
The '''qtomFields''' table holds data that is required to write curves to [[Quantum]] via [[QTOM]] but is not explicit in the curve tables ''fxrates'', ''intrates'' and ''volhdr''.
!!! historicalCurves
Each time a curve is built by DataScopeRateFeed, it adds a row to the '''historicalCurves''' table.
* If ''errorId'' is not zero, this curve will not be used by QuantumRateFeed. (It is kept for analysis purposes.)
* The ''input_dt'' reflects the update date of the most recently updated [[RIC]] (from the [[pool database]]) that was used to build this curve.
!!! curveErrors
If there is an error when building a curve, it is written to this ''curveErrors'' table and linked to ''historicalCurves''. Multiple errors for one curve are joined with newline characters.
!!! eodCurves
The '''eodCurves''' table is used to flag which ''historicalCurves'' were live when their "End of Day" job ran (as defined with the ''EOD'' field in the [[Configuration Files]]).
* The ''seqno'' increments indefinitely; QuantumRateFeed uses it to know when a curve has been flagged as "End of Day"
* The maximum value of this field is 9,223,372,036,854,775,807. Assuming 400 [[End of Day]] curves a day (double the current number), this ''seqno'' will need to be reset once every 63 trillion years.
* Note that a single curve could be flagged more than once in '''eodCurves'''. This reflects that it was the most current curve during the "End of Day" run, but we know it will fail on insert by the QuantumRateFeed (unless deleted from [[Quantum]] first).
!!! eodLog
The '''eodLog''' table shows the number of curves that have been flagged in each "End of Day" run. It is used by QuantumRateFeed to "push End of Day".
----
!!! fxrates
Corresponds to fxrates in the [[Quantum]] database
!!! fwdpts
Corresponds to fwdpts in the [[Quantum]] database
!!! intrates
Corresponds to intrates in the [[Quantum]] database
!!! volhdr
Corresponds to volhdr in the [[Quantum]] database
!!! volrates
Corresponds to volrates in the [[Quantum]] database
!!! margins
Margins are stored in intmarg in the [[Quantum]] database. However, that layout does not suit the margins used by DataScopeRateFeed, none of which are written into [[Quantum]]; ''margins'' is a cut-down version of intrates.
----
!!! oldCurves
The '''oldCurves''' table holds temporary data used by the Daily Maintenance process
!!! oldErrors
The '''oldErrors''' table holds temporary data used by the Daily Maintenance process
!!! sqlite_stat1
The '''sqlite_stat1''' table is an SQLite internal table and used by the Daily Maintenance process
!! Maintenance
The DataScopeRateFeed removes old rows and performs a "vacuum" at 3:00am each morning. Old rows are defined as curves and messages older than 90 days (or as defined in the "Curve data" section of the [[DataScope RateFeed/Configuration]]).
You can perform other maintenance and ad hoc enquiries by using the "~SQLite Manager" extension to Firefox, but take a backup and create a workitem before doing this.
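If you prefer to take that backup from a script rather than by hand, a minimal sketch follows. The backup name is illustrative, and ideally the DataScopeRateFeed should be idle while the copy is made so the file is consistent.
{{{
# Minimal sketch: copy the curves database aside before any ad hoc maintenance.
import shutil, datetime

src = r'k:\treasury\rate feed\curves.sqlite'
dst = src + datetime.date.today().strftime('.%Y%m%d.bak')   # e.g. curves.sqlite.20140220.bak
shutil.copy2(src, dst)   # copies the data and the file timestamps
print('Backed up to', dst)
}}}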
!! Reading the log table
Generally you ask the [[RateFeed Dashboard]] for snapshots or exports of the log file. However, you can interrogate the table directly as follows:
{{{
import sqlite3, datetime

START_TIME = datetime.datetime(1970, 1, 1)  # ref CurveData in dsDatabase.py

def asSecs(dt):
    """Converts a date to seconds since START_TIME (1/1/1970)"""
    delta = dt - START_TIME
    return delta.days * 86400 + delta.seconds

def asDate(secs):
    """Converts seconds since START_TIME back to a date"""
    return START_TIME + datetime.timedelta(seconds=secs)

db = sqlite3.connect(r'c:\Documents and Settings\MauriceM\My Documents\curves.sqlite')

# The most recent log entry in this copy of the database
print(asDate(db.execute('select max(msgDt) from log').fetchone()[0]))

# All "Ric MRF ... not use ..." messages since 1 September 2011, oldest first
sql = 'select msgDt, status, level, j.job, m.message from log l ' + \
      'join jobNames j on l.jobId = j.jobId ' + \
      'join logMessages m on l.messageId = m.messageId and message like \'Ric MRF%not use%\' ' + \
      'where msgDt > ? order by msgDt asc'
cursor = db.execute(sql, (asSecs(datetime.datetime(2011, 9, 1)),))
for row in cursor.fetchall():
    row = list(row)
    row[0] = asDate(row[0]).strftime('%d-%b-%y %H:%M')
    print('\t'.join(map(str, row)))
db.close()
}}}
!! Design
The structure of the [[curves database]] is based heavily on that of [[QTProduction]], despite that not being optimal. The intention of this decision is to allow easy analysis of the data between [[QTProduction]] and the [[curves database]]. It also allowed the QuantumRateFeed to be a simple enhancement of an earlier testing script that read the production database and wrote the rates to a test server.
@@background-color:#EEF;''Note'': In theory, this means you could run the [[RateCheckerEOD.py]] against the [[curves database]]. In practice, this would not prove anything.@@
The key difference between the rate curves in [[QTProduction]] and this [[curves database]] is that [[QTProduction]] has only one live rate curve per day while [[curves database]] has every live rate curve built. The [[curves database]] does this by:
* replacing ''thekey'' with ''histCurveId'' which links into the historicalCurves
* ''rateType'' is always "Live" and "eodCurves" flags which of these live curves is "End of Day"
** This is because DataScopeRateFeed builds a curve for each related [[RIC]] update
** At the [[End of Day]] event, the current live curve becomes the "End of Day" curve, thus one curve is both.
** If a curve does not update for several days (over a weekend or holiday), it is the [[End of Day]] curve for each of those days.
* The ''input_dt'' in ''intrates'', ''fxrates'' and ''volhdr'' reflects the actual time the curve was built
* The ''input_dt'' in ''historicalCurves'' reflects the update time of the most recently updated RIC in the [[pool database]]
* The ''histCurveId'' in ''curves'' reflects the current "Live" curve (that has no errors in its calculations)
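Given that design, a sketch of fetching the current "Live" build of one configured curve follows. The column names (''name'', ''histCurveId'') and the curve name used here are assumptions based on the description above, not a verified schema.
{{{
# Sketch only: the current "Live" (error-free) build of one configured curve.
# Column names and the curve name are assumptions, not a verified schema.
import sqlite3

db = sqlite3.connect(r'k:\treasury\rate feed\curves.sqlite')
sql = '''select c.name, h.histCurveId, h.input_dt
         from curves c
         join historicalCurves h on h.histCurveId = c.histCurveId
         where c.name = ?'''
print(db.execute(sql, ('AUD Yield Curve',)).fetchone())
db.close()
}}}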
!! References
* [[RateFeed]]
** DataScopeRateFeed
** QuantumRateFeed
* [[QTProduction]]
dsConfiguration.py is a module of DataScopeRateFeed.
!!References
Defn: a module of DataScopeRateFeed.
dsFormula.py is a module of DataScopeRateFeed.
!!References
Defn: a module of DataScopeRateFeed.
<<tiddler [[End of Day]]>>
The End of Month Rates are mostly the same as [[end of day]] rates for the last calendar day of the month. However, some curves that have an effective date which is calculated from a [[RIC]] may not have an end of day rate on that last calendar day, if that day is not a business day for the source exchange. In that case, the End of Month Rate will be a day or two earlier. (For example, the USD Yield Curve End of Month Rate for July 2011 is found on the 29 July 2011.)
With regard to the RateFeed process, month end effectively starts by about 3:00pm on the last calendar day of the month and runs until mid morning after the first trading day of the next month. From that point, the RateFeed applications should not be touched as they are feeding rates into the system.
!! References
* [[End of Day]]
Defn: an accounting notion - when the [[Rates]] are used to post pricing into the General Ledger
The [[pool database]] is an SQLite database used by the DataScopeRateFeed. Its production copy is located in the {{{Rate Feed}}} subfolder of the treasury folder ({{{\\oakrmsmscs.oakrms.local\iRMS\Treasury\Rate Feed\pool.sqlite}}}). It is backed up as part of the hosting services provided by [[SunGard]].
The file is of middling size, typically 50 MB, as it contains [[RIC]] data from the last 31 days (or as configured in the [[DataScope RateFeed configuration files|DataScope RateFeed/Configuration]]).
@@background-color:#EEF;''Note'': The inputDt in the snaps table will be the insert time unless the snap comes from DataScope, in which case it will be the time stamp of the csv file.@@
!! Structure
!!! test
The '''test''' table is used by DataScopeRateFeed to confirm it has write access to the [[pool database]]
----
!!! rates
The '''rates''' table contains the field data of every snap of every instrument (from [[Reuters Datascope Select]] and the [[Manual RateFeed]]) in the last 31 days (or as configured).
* The ''snapId'' field links to the ''snaps'' table and clarifies the origin of the rate
* The ''rateId'' is an internal primary key
* The ''errors'' field contains any errors associated with the snap of the instrument
* The remaining fields are those provided by the data sources. As a new field is added to a data source, it will appear in this table (with the spaces removed and the first letter of each subsequent word capitalised)
** Generally these fields are null if the field is not relevant to the instrument (for example, the bidYield is null for the AUD/USD spot "AUD=")
!!! currentRates
The '''currentRates''' table is a utility table that contains one row per [[RIC]]. It is the most up-to-date row (from ''rates'') that does not have errors. It has the same fields as ''rates'' except the rateId.
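For ad hoc checks, the latest good snap of one instrument can be read straight from ''currentRates''. This is a sketch only: the [[RIC]] column name is an assumption, and "AUD=" is just the spot RIC mentioned in the ''rates'' notes above.
{{{
# Sketch only: the latest good rate for one RIC from currentRates.
# The column name "ric" is an assumption, not a verified schema.
import sqlite3

db = sqlite3.connect(r'k:\treasury\rate feed\pool.sqlite')
row = db.execute('select * from currentRates where ric = ?', ('AUD=',)).fetchone()
print(row)
db.close()
}}}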
----
!!! snaps
Each time a "bunch of rates" arrive, the DataScopeRateFeed records a row in this table and links the rates (in ''rates'') via a ''snapId''. The table has:
* The ''totalInstruments'' expected in the snap
* The ''numInstruments'' received in the snap. This is the same as ''totalInstruments'' unless some instruments had errors (or did not exist).
* The ''embargo'' field indicates if the snap has embargoed instruments
* The ''inputDt'' is the date-time the snap was actually entered into the [[pool database]]
* The ''rateDt'' is the date-time of the snap. Typically, this is the time-stamp of the file in [[Reuters Datascope Select]].
* The ''snapTypeId'' links this record to the ''snapTypes'' table
!!! snapTypes
The '''snapTypes''' table is used to record common information about snaps. Most [[Reuters Datascope Select]] snaps will be recorded here. When an unexpected snap is received, DataScopeRateFeed creates an "ad hoc" row in this table:
* ''in_use'' = "N" ⇒ this is an ad hoc snap, or a snap no longer in use
* The ''totalInstruments'' expected in the snap
* The ''embargo'' field indicates if the snap is expected to be embargoed
!!! snapFiles
The '''snapFiles''' table has a row for every file (on [[Reuters Datascope Select]]) that has been processed. Where it was successfully processed by a snap, the ''snapId'' links it. If ''snapId'' is "-1", it has been ignored for some reason
!!! ricMaintenance
The '''ricMaintenance''' table is used to record any entries found in the ".rics.csv" files found on [[Reuters Datascope Select]]. Generally these files are empty, so there should not be much in this table. See the [file:///S:\RMS\Documents\Rates%20&%20RateFeed\DataScope%20Select%20-%20FTP%20User%20Guide.pdf DataScope Select - FTP User Guide.pdf] for an explanation of its columns.
@@background-color:#EEF;''Note'': At present there is no [[RMS]] action on entries appearing here. I envisage that behaviours may be added to future versions of the DataScopeRateFeed to handle certain types of entries here.@@
----
!!! oldSnaps
The '''oldSnaps''' table holds temporary data used in the daily maintenance of the [[pool database]].
!! References
* [[DataScopeRateFeed]]
* [[curves database]]