Higher quality data from the front office?
Sungard had a good event on Thursday night, with four risk managers taking the stage for a "thought leadership" seminar entitled "Regulatory Impact of Market Events" (if the advert is still around on their site, see http://www.sungard.com/ADAPTIV/default.aspx?id=4678&formAction=takeit&formid=48).
The Dresdner risk manager (Ted Macdonald, a good speaker) emphasised that data quality is a real issue for risk management, and all the panellists agreed that risk managers should spend more time on risk and less on validating and cleaning data (no great surprises there, then, but interesting to hear it confirmed again as an issue).
He suggested that more pressure should be put on the front office to get data right first time (as opposed to leaving everyone else to sort out the mess!), even going so far as to suggest charging the front office for each wrongly-booked trade in the trading and risk management systems. Not sure how that would go down with the trading desks, but it sounds like a good approach if you could agree on (and unambiguously measure) what counts as a mistake!
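As a toy illustration (not anything presented at the seminar, just a sketch): if you could get agreement on flagging wrongly-booked trades, the charge-back arithmetic itself is the easy part. The desk names, error flags and fee below are all made up.

```python
from collections import Counter

# Hypothetical per-error charge; any real figure would need to be
# agreed between the front office and risk/operations.
FEE_PER_ERROR = 100.0

# Invented trade records: the originating desk, plus whether downstream
# validation flagged the booking as wrong.
trades = [
    {"desk": "FX Options", "wrongly_booked": True},
    {"desk": "FX Options", "wrongly_booked": False},
    {"desk": "Rates",      "wrongly_booked": True},
    {"desk": "Rates",      "wrongly_booked": True},
]

# Tally booking errors per desk and turn them into a charge.
errors = Counter(t["desk"] for t in trades if t["wrongly_booked"])
for desk, count in errors.items():
    print(f"{desk}: {count} bad bookings -> charge {count * FEE_PER_ERROR:.2f}")
```

The hard part, of course, is not the loop above but getting everyone to agree on what "wrongly booked" means and who gets to set the flag.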
Seems like transfer-costing is becoming a recurring theme: it was also mentioned recently by a grid computing specialist from Credit Suisse, who talked about "metering" each desk for the amount of compute power it uses… anyone out there retraining as a management accountant? Sounds like the banks will be hiring soon!
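The compute-metering idea reduces to the same pattern. A minimal sketch, assuming the grid scheduler can report CPU-hours consumed per desk (the rate and usage numbers here are invented):

```python
# Made-up internal rate per CPU-hour; in practice this is exactly the
# sort of number those management accountants would be hired to set.
RATE_PER_CPU_HOUR = 0.05

# Hypothetical usage report from the grid scheduler: desk -> CPU-hours.
usage = {"Credit Derivatives": 120_000, "Equity Exotics": 45_000}

for desk, cpu_hours in usage.items():
    print(f"{desk}: {cpu_hours} CPU-hours -> bill {cpu_hours * RATE_PER_CPU_HOUR:.2f}")
```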