Wednesday, July 23, 2014

The Trading's Trailing Off

There's been some generally miserable news floating around about big bank trading levels.

In a Forbes article, the team at Trefis put lower trading revenues down to "an overall reduction in trading activity over the period (a temporary factor) and a reduction in total market size as a direct result of stricter regulations (a permanent factor)."

Meanwhile, a number of media outlets were quick to cover the fact that trading levels within Barclays' dark pool declined an incredible 66% during the week ending June 30th, versus the prior week.

We investigated this a little further and found that the week ending June 30th was not a good one for any of the alternative trading systems (ATSs).  It's not necessarily (or only) that Barclays' former clients were aggrieved at certain claims or findings made public in the NY Attorney General's complaint against Barclays, as a strict reading of some of the coverage might suggest -- trading declined generally across ATSs, with overall volume off 25% and the median ATS down roughly 20%.  The numbers stay in the same region even if we control for smaller ATSs by looking only at the 15 largest ATSs, as measured by share-trading volume for the week ending June 9.

Banking pundits will be hoping this is only midsummer madness, or maybe due to interim distractions from the Football World Cup.  The week ending June 30th coincided with the final week of group games.

Anyhow, here are the numbers from our extraction of aggregated trade data reported by ATSs to FINRA pursuant to Rule 4552.
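For readers who want to reproduce this kind of week-over-week comparison from the Rule 4552 data, here's a minimal sketch. The data shapes (a dict of ATS name to weekly share volume) are our own simplification; FINRA's actual files need parsing first.

```python
from statistics import median

def ats_declines(base_week, current_week, top_n=None):
    """Compare weekly share volume by ATS between two weeks.

    base_week / current_week: dicts mapping ATS name -> shares traded.
    top_n: if given, restrict to the N largest ATSs by base-week volume.
    Returns (overall_pct_change, median_pct_change) as fractions.
    """
    names = set(base_week) & set(current_week)
    if top_n is not None:
        names = set(sorted(names, key=lambda n: base_week[n], reverse=True)[:top_n])
    # Per-ATS week-over-week change, for the median
    changes = [(current_week[n] - base_week[n]) / base_week[n] for n in names]
    # Aggregate change across all included ATSs, for the overall figure
    total_base = sum(base_week[n] for n in names)
    total_curr = sum(current_week[n] for n in names)
    return (total_curr - total_base) / total_base, median(changes)
```

Note that the "overall" figure weights large ATSs more heavily than the median does, which is why the two statistics can differ.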

Thursday, June 19, 2014

We'll Promise You Best Execution -- For Us (That Is)

The high-frequency investigations just heated up a notch this week, with some choice testimony coming from the grillings on Capitol Hill.

We previously had investigations (SEC, FBI, DOJ, NY AG) into the activities within dark pools, and they were mostly concerned with whether information about some parties was being disclosed to other parties when it might have been expected to be hidden.  There was also more general concern about whether the game had changed in such a way as to make it easy for high frequency trading firms (the HFTs) to game the "ordinary" investor.

The recent hearings, held by the Permanent Subcommittee on Investigations, have now homed in on the all-important question of whether online or discount brokers are appropriately routing their customers' order flow in the best interests of the customer.

As highlighted in a WSJ article, there seems to be an indication that, at least for TD Ameritrade, the flow went to the exchange that was most likely to produce the highest refund (or revenue gain) to TD.

We haven't as yet been able to track down the transcript of the hearing (aside from the written statements, that is) but the snippet below seems to corroborate the WSJ's coverage:

"Mr. Levin asked [TD Ameritrade's] Mr. Quirk whether the firm's routing decisions "virtually always led you to route orders to the markets that paid you the most."
"Virtually, yeah," Mr. Quirk replied.

TD Ameritrade had already been feeling the heat from some clients who recently learned how much TD made from selling order flow to Citadel and Knight Capital. This testimony, we imagine, won't help any!

Wednesday, April 9, 2014

Are "Dark Pool" Probes Pending?

Banks' so-called "dark pool" trading venues are all the rage these days.

The media jumped when dark pools were cited in federal authorities' and investor class action complaints against SAC Capital and its executives. The focus was on how the anonymity associated with dark pools, and the levels of secrecy they provide, allowed SAC and/or its members to avoid detection and potential losses on their sales of stock. (See also Gazing into 'dark pools,' the tool that enables anonymous insider trading)

One quote from a complaint reads:
“We executed a sale of over 10.5 million ELN for [various portfolios at CR Intrinsic and SAC LP] at an avg price of 34.21. This was executed quietly and efficiently over a 4 day period through algos and darkpools and booked into two firm accounts that have very limited viewing access.”
Next, Goldies brought its dark pool, Sigma X, to the fore.  Having discovered pricing errors within the opaque pool, Goldman reportedly decided to send refund checks to customers to compensate them for the mistakes.

Michael Lewis didn't make matters any easier for Goldman or dark pools, giving them a hard time in his new book, Flash Boys.  Among other things, he casts doubt on whether investors got "best execution" through the dark pools:
“A broker was expected to find the best possible price in the market for his customer. The Goldman Sachs dark pool—to take one example—was less than 2 percent of the entire market. So why did nearly 50 percent of the customer orders routed into Goldman’s dark pool end up being executed inside that pool—rather than out in the wider market?”
That quote, alone, might not be altogether convincing: it's not clear whether he's looking at scenarios in which Goldman's clients requested execution through Sigma X, or whether clients requesting best execution were oddly often executed through Sigma X, despite the potential for sub-optimal execution on that platform.  It's probably fair to say that those requesting execution through the dark pool would agree that they were foregoing "best execution" in the wider market - which is something one typically foregoes even with "hidden" orders submitted to an (open) exchange.

But now Goldies is back in the spotlight.  According to today's WSJ, they're considering shutting down their dark pool.

But why?

According to the Journal article, Goldman executives are weighing the revenues the pool produces against the burdens of dealing with trading glitches and negative press.  Some burdens those must be, given that Sigma X is purportedly one of the largest bank dark pools and likely produces significant flow.

Perhaps there's another theory...

Consider these stories:

In January 2014 Barclays decided to shut down its retail / margin Foreign Exchange business, Barclays Margin FX; in February, NY regulator Lawsky opened a currency markets probe.

In January 2014 Deutsche Bank AG (DBK) announced that it will withdraw from participating in setting gold and silver benchmarks in London;  in March, the CFTC announced that it is looking at issues including whether the setting of prices for gold—and the smaller silver market—is transparent. 

In July 2013, the CFTC put metals warehouses on notice of a possible probe. By November 2013 Goldman was resuming talks to sell its metals warehouses and seeking a buyer for its uranium trading unit; and by March 2014 we had various notices of JP Morgan's intent to sell its physical commodities divisions.

Probe and Sale

We could go on and on, but ultimately these are all anecdotal and we aren't looking to prove statistical significance at this stage.  We're also not too concerned about which comes first: the probe or the sale.  There's certainly no one-to-one mapping.  Not every regulatory probe is followed by a sale, or vice-versa.  We're only wondering if there's a pattern. And if there's a pattern, could there be an explanation for it?

Here's one theory.  (We welcome yours.)  Might it be that, pending a likely or imminent (and embarrassing or expensive) enforcement action, banks may take preemptive action in selling "problematic" divisions ... to enable the negotiation of a more lenient settlement as they're (now) less likely to be repeat offenders of whatever activity was the subject of the probe?

In other words, is a dark pool probe pending?

Friday, April 4, 2014

High Frequency (Non) Trading

This week's release of Michael Lewis' new book, Flash Boys, has renewed focus on a little-understood area of the market, an area that has garnered the recent attention of market regulators, New York's Attorney General, and more recently the FBI -- but never as much attention as it garnered from Michael Lewis' interview on 60 Minutes on Sunday, with his book due for release the following day.

Without going into too many specifics, one of the central themes that Lewis discusses is the potential for high frequency traders (or HFTs) to take advantage of certain market information -- like bids and offers -- that are unknown to many other market players.

Defenders of HFTs have come out aggressively, with claims that HFTs increase market activity and liquidity, and have lowered trading costs.  The WSJ published an extensive opinion editorial by hedge fund guru Cliff Asness and his colleague Michael Mendelson of AQR, which energetically claims that much of what HFTs do is "make markets" and that they do it best because "their computers are much cheaper than expensive Wall Street traders, and competition forces them to pass most of the savings on to us investors."

Of course this sounds altogether too convincing.  Unfortunately, Asness and Mendelson provide little or no evidence (although their business as long-term traders relies heavily on evidence, and they claim in the article to spend considerable energy looking into their trading costs), and they admit that they don't actually have much conviction in the premise of their exposition:
"We think it helps us. It seems to have reduced our costs and may enable us to manage more investment dollars. We can't be 100% sure. Maybe something other than HFT is responsible for the reduction in costs we've seen since HFT has risen to prominence, like maybe even our own efforts to improve." (emphasis ours)
But this aside, no doubt all forms of HFTs bring liquidity.  They're a good thing.  Let's focus our attention elsewhere.  

Or not?

Might there be another type of HFT, that doesn't always bring liquidity for the greater good of the market ...  perhaps a type that uses obscure mechanisms to change the look and feel of the market -- to make people think there is a bid, think there is an offer, without there being one?  

This is what Flash Boys, and the interest it has invigorated in HFTs, really concerns itself with -- understanding market maneuvers like spoofing or pinging: the submission of phantom orders, immediately cancellable, that have the potential to create a false impression of market levels.

Are we creating a whole lot of (potentially fictitious) orders, but not a whole lot of activity?  Are there high-frequency non-traders?  Are we mis-marking our portfolios as a result? We continue to investigate.  But we couldn't help but bring you back to a 2013 chart from Mother Jones, which highlights the growing contrast between actual trades (in orange) and quotes/orders (in red).

Monday, March 17, 2014

Government Credit Crisis is Over - So Where are the Ratings Upgrades?

The sovereign and municipal debt crisis of the early 2010s is finished. Overblown predictions of a credit meltdown among European sovereigns, US states and cities, and other advanced economy governments have not been realized. Yes, there have been a few high profile defaults - Greece, Detroit, Harrisburg, Stockton and San Bernardino all come readily to mind because their insolvencies received so much coverage. But many other predicted defaults – Italy, Spain, California, Illinois, San Jose – failed to materialize and the overall default rate among government issuers has been only a few basis points annually. Meredith Whitney’s 2010 forecast of dozens of major municipal defaults is now fully beyond resuscitation – even by Michael Lewis.

Muni bond market shorts set their 2014 hopes on Puerto Rico, but this month’s successful $3.5 billion bond sale makes the odds of a near-term default or restructuring remote. Last year, both of the Commonwealth’s major pension systems were overhauled, with many current employees taking reduced benefits. Most of Puerto Rico’s debt is long term and annual deficits are relatively low, so the Commonwealth’s intermediate-term financing needs are modest.

The end of the default “wave” and its limited magnitude leave credit rating agencies in an awkward position. Having repeatedly downgraded government credits, their current ratings are inconsistent with those that prevailed at the beginning of the apparent crisis. Also, their government credit ratings are now even more inconsistent with their ratings for corporate and structured finance – asset classes that have more underlying risk because their issuers cannot levy taxes.

In 2013, Fitch announced that it had downgraded twice as many US public finance credits as it upgraded. Moody’s 2013 transition report has yet to appear, but weekly accounts of its upgrades and downgrades suggest a similar pattern. This preponderance of downgrades is occurring despite the overall improvement in state and local government finance. Renewed economic growth is yielding more income and sales tax revenue, rising home prices are swelling property tax receipts, and a buoyant stock market is shrinking unfunded pension liabilities. But because Moody’s decided to use a lower rate-of-return assumption for pension fund assets, it has created the perception of increased credit risk where there is none. The blizzard of downgrades has largely offset the (upgrading) effects of the 2010 reconfiguration of the municipal ratings scale, undertaken in the wake of a lawsuit by Connecticut’s attorney general.

Meanwhile, the high profile states of California and Illinois remain at single-A despite the marked improvement in their prospects. Since Moody’s last downgraded California, the state has swung from deficit to surplus and seen a substantial decrease in its unemployment rate. Illinois, downgraded in mid-2013 due to a temporary failure to pass pension reform, has yet to see a compensatory upgrade now that the reform has been enacted. My own view was that neither state had material default risk in the medium term, given their low debt service requirements relative to projected revenue.

Markets appear to be rejecting the drumbeat of dire rating actions. In the same week that Puerto Rico successfully sold its non-investment grade issue, Chicago placed $884 million in securities on the heels of two Moody’s downgrades (a three notch reduction from Aa3 to A3 in July 2013 followed by a further one notch cut to Baa1 this month).

Perhaps markets have started to ignore ratings because they have become so rudderless. Ratings have inconsistent meanings because they are products of human discretion. If, instead, they were the outcome of stable, empirically-based algorithms, ratings would more likely have the same meaning across time and between categories. Unlike human analysts, computer models don’t have to worry about criticism that they are being soft on politicians or inadequately mindful of unfunded pension liabilities – which are rarely associated with actual bond defaults anyway.

Finally, it is worth noting that inconsistent, incoherent ratings are not merely a sin of US rating agencies. Dagong, which commanded respect for issuing a sub-AAA rating to the US back in 2010, has not covered itself in glory since. After the end of the October 2013 government shutdown it inexplicably downgraded the US rating to A-.

The Chinese rating agency, apparently unaware that partial shutdowns are a familiar part of the US political scene, suggested that the October incident reflected an unprecedented level of risk. Contrary to political and media hyperbole, there was never a serious risk of a Treasury default arising from either a shutdown or a delay in raising the debt ceiling. While I agree that an issuer that engages in kabuki theatre over its credit obligations cannot warrant a top rating, it is absurd to place the world’s most powerful government a few notches above junk amidst declining deficits and accelerating economic growth. Further, we should all take pause from the fact that the Fed has proven capable of buying the lion’s share of new Treasury issuance with freshly printed money and without triggering price inflation.

Dagong’s goal appears to be to convince the world that the US is a worse credit than China. That’s a hard case to make given the latter’s relatively short history as a market participant, its lack of transparency and the risk that its single party political system cannot be sustained over the long term.

But regardless of the ratings themselves, Dagong’s process is disturbingly similar to that of the Western incumbents – discretionary ratings subject to political pressure and human biases. This is unfortunate for a rating agency that hopes to displace the ruling ratings triumvirate. By declining to offer a superior analytical product, Dagong leaves investors little choice but to stay with the incumbents.

Thursday, February 20, 2014

Mortgage Servicers, Underserving?

There's been a lot of news coverage in the last few months on the changing nature of the mortgage servicing industry, and consumer and regulatory difficulties with the status quo. (See for example, here and here.)

Among other things, late last year mortgage servicer Ocwen Financial (OCN) paid roughly $2.2bn to settle claims made against it by the Consumer Financial Protection Bureau (CFPB) that it, according to bureau director Richard Cordray, had "violated federal consumer financial laws at every stage of the mortgage servicing process." 

According to the NY Times, the $2.2bn settlement covers activities from 2009 to 2012 by Ocwen and two companies it recently acquired.

But what about "activities" since 2012?

We did some digging into the number of CFPB complaints being filed by borrowers against mortgage servicers in 2013.  Looking at only those complaints pertaining to (1) loan servicing, payments, escrow account or (2) loan modification, collection, foreclosure, we found an increase of over 20% from 2012 to 2013, broken down as follows:
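The filtering behind these counts can be sketched from the CFPB's public complaint database export. The field names and exact issue strings below are illustrative; check the current export before relying on them.

```python
from collections import Counter

# Issue labels roughly as they appear in the CFPB complaint database
# (treat the exact strings as illustrative, not authoritative).
SERVICING_ISSUES = {
    "Loan servicing, payments, escrow account",
    "Loan modification, collection, foreclosure",
}

def yearly_servicing_complaints(records):
    """Count mortgage-servicing complaints per year.

    records: iterable of dicts with 'year' and 'issue' keys, e.g. rows
    parsed from the database's CSV export.
    """
    return Counter(r["year"] for r in records if r["issue"] in SERVICING_ISSUES)

def pct_increase(counts, earlier, later):
    """Year-over-year percentage increase in complaint counts."""
    return 100.0 * (counts[later] - counts[earlier]) / counts[earlier]
```

The same counting could of course be broken out further by company, which is how a servicer-by-servicer table is produced.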

Monday, February 10, 2014

Puerto Rico Rating Downgrades: Enron Redux?

On November 28, 2001 Enron lost its investment grade credit rating. Four days later, the company filed for bankruptcy. Those awaiting a similar collapse after Puerto Rico’s descent into junk bond territory last week will have to wait a lot longer to see the Commonwealth’s financial denouement.

The relatively slow motion nature of Puerto Rico’s fiscal collapse – if, in fact, one is occurring – underscores the differences between various classes of public sector and private sector debt. It also speaks to changes in market conditions.

As with the 2011 S&P downgrade of the US, the rating agency actions had little impact on Commonwealth yields. The New York Times reported last Wednesday that investors had shrugged off the S&P action. On Friday, the Wall Street Journal reported that Puerto Rico General Obligation debt traded at a lower yield after the Moody’s follow-on downgrade than it had earlier in the week.

The limited impact of the ratings downgrades might be attributed to market discounting – since the rating agency actions were widely anticipated. It could also speak to the greatly reduced reputation rating agencies enjoy in the aftermath of the Enron/Worldcom scandals of the early “aughts” and the subprime fiasco of 2008.

Unlike Enron, Puerto Rico can operate for some time without capital markets access. The Commonwealth can get by without financing because its fiscal deficits are relatively low and its debt is predominantly long term. It thus does not need that much new cash to finance ongoing operations or to roll over previous bond issues.

But, sooner or later, Puerto Rico will have to bring new issues to market, and many doubt whether investors will be around when it does. Commonwealth-related debt accounts for about 2% of overall US municipal bonds outstanding and its fall from investment grade leaves many traditional investors out of the running. So it would appear that there is a lot of debt and not much appetite.

In my view, this analysis misses some key institutional developments. Hedge funds and certain other classes of investors can traverse multiple markets. Further, Asian investors have accumulated billions of savings and remain on the lookout for alternatives to low yielding US Treasuries. So the constituency for Puerto Rico debt is not merely the $3.7 trillion municipal market, but a much larger audience especially if the price is right. 

Puerto Rico debt is now trading at yields much higher than those of Italy, Spain and Portugal - and roughly on a par with Greece. In contrast to Greece, Puerto Rico is not a serial defaulter. In fact, it is part of an asset class – US state and territorial bonds – that has not seen a default in over 80 years. Further, the last default – Arkansas in 1933 – ended in a full recovery for investors. So, from an international perspective, Puerto Rico bonds appear to offer good relative value.

Thus if new Puerto Rico bonds are offered at 8% or 9%, I expect that they will find a bid. While coupons at that level are not fiscally sustainable, the fact that most Puerto Rico government debt is long dated means that Commonwealth interest expenses as a proportion of revenue will remain low relative to previous default cases.

Unlike Enron or another private company, a US sub sovereign like Puerto Rico has secure revenue sources in the form of taxes and federal assistance. As Detroit has shown, insolvency is ultimately possible, but the path to ruin for a public sector debt issuer is usually a long one.

Notes: I purchased a small number of Puerto Rico GO bonds last month. Any opinions provided herein are my own. PF2 is an independent third party and does not provide investment advice. For my previous commentary on Puerto Rico's lack of fiscal transparency, click here.

Friday, January 3, 2014

TruPS CDOs - Still Underrated?

Hello readers, and welcome to 2014!

In our first post of the year, we're going to continue with a theme we've been commenting on for years: we're noticing that many TruPS CDOs languish at deflated ratings levels, not having been adequately attended to by the rating agencies since their 2009 downgrades.  

Consider, for example, this first-pay, amortizing bond (CUSIP 903329AB6) from one deal (US Capital Funding I), which has strengthened dramatically since its original Aaa/AAA ratings were assigned back in 2004. It now has much more than 100% principal cushion! Yet, oddly enough, the ratings remain far lower to this day.  S&P still has it in deep junk territory, at B+, albeit on "watch" for upgrade.
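For a sense of what "principal cushion" means here, a deliberately simplified par-coverage calculation (it ignores interest, fees, and the deal's actual priority of payments, so treat it as a sketch, not deal analytics):

```python
def principal_cushion(performing_collateral_par, tranche_balance):
    """Simplified par-coverage cushion for a first-pay tranche.

    Collateral par available beyond the bond's remaining balance,
    expressed as a fraction of that balance. A value above 1.0 is
    "more than 100% cushion": even if over half the collateral
    defaulted with no recovery, par would still cover the bond.
    """
    return performing_collateral_par / tranche_balance - 1.0
```

As the first-pay bond amortizes, the denominator shrinks while the collateral pool (less defaults) does not shrink as fast, which is how cushions like this build up over time.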

Let us know if you would like to join the call for appropriate upgrades in 2014!


Friday, December 20, 2013

Richmond's Eminent Domain Program: It Could Really Happen

Richmond, California’s plan to use eminent domain to free city homeowners from underwater mortgages took a number of steps forward recently. Until this month, I would have given very long odds against Richmond actually going ahead with the plan, but now I would place the chances at close to 50/50.

The first development occurred last week in Washington with the Senate’s confirmation of Melvin Watt as Director of the Federal Housing Finance Agency – which regulates Freddie Mac and Fannie Mae.  Under Acting Director Ed DeMarco, the agency had suggested it might prevent GSE financing in Richmond if the city moved forward with its plan to seize properties under the guise of eminent domain. The threat of not being able to get a Fannie- or Freddie-insured mortgage would have inflamed local opposition to the program. But, given Watt’s more progressive orientation, I doubt that he will continue DeMarco’s resistance. I suspect Watt is much more likely to look at the situation through the “little people vs. big banks” lens than his predecessor – whose career was devoted to the health of the home financing market.

The other developments occurred Tuesday night at the City Council meeting, the video of which is available here (most of the relevant discussion can be found between the 3 hour and 6 hour marks).  Mayor Gayle McLaughlin proposed and won passage of a motion that fine-tuned the program in a couple of important ways.

First, her measure imposed guidelines limiting the eminent domain program to mortgages below the conforming loan limit (now $729,750) and in struggling neighborhoods. This change addresses a revelation first made here at ExpectedLoss and later picked up in a WSJ blog and by the San Francisco Chronicle: that the program would have benefitted owners of some very expensive properties in affluent neighborhoods.

Second, the McLaughlin proposal calls for the eminent domain power to be exercised by a “Joint Powers Authority” rather than by the city itself. As reported by the Chronicle’s Carolyn Said, this change allows the eminent domain action to be approved by only a simple majority vote – rather than a super-majority as previously required. There appear to now be three solid “no” votes out of the seven officials who vote on the Council, so the need for a super majority appeared to be a deal breaker.  McLaughlin should be able to hold onto the four votes necessary to create the JPA and implement the eminent domain program. 

But the JPA device imposes a new challenge: another city would have to agree to participate in the JPA. Although McLaughlin and her supporters listed a number of potential partners in California and elsewhere, none of these cities have gone as far down the eminent domain path as has Richmond. Indeed, it is possible that none of these cities will ever get beyond the talking stage. This assessment applies especially to San Francisco – a city whose skyrocketing home prices have left few mortgages underwater.

A more likely candidate, El Monte, will need to be careful. The city declared a fiscal emergency in 2012 and some of its bonds carry non-investment grade ratings. Richmond has already been punished by the municipal bond market despite its superior fundamentals; it’s hard to see how a lesser credit like El Monte will attract investors if it goes ahead with eminent domain.

So the need to get a partner is a significant barrier – but not an insurmountable one. Undoubtedly, the ambitious folks at Mortgage Resolution Partners are very hard at work finding Richmond a mate. Of course, the path to finally condemning mortgages leads through the courthouse. Whether Richmond can prevail, with a very unusual interpretation of the takings clause, against a battalion of well-financed Wall Street lawyers is another bet entirely.

Thursday, December 19, 2013

Good Intentions are Not Enough: The Problem of SEC Mandated XBRL Reporting

Public companies have been required to supply financial reports since the Depression, but gathering and analyzing this disclosure has had its challenges. In the 1990s, the SEC began uploading 10-K’s and 10-Q’s to the internet, greatly simplifying the data collection task. These electronic reports were not standardized, creating the need for downstream users to write complex parsing algorithms and/or use manual processes to harvest the financial statements.

In the late 1990s, accounting and technology firms devised a standard called XBRL – eXtensible Business Reporting Language – to streamline the data acquisition process. XBRL disclosures rely on a common system of tags that consistently identify financial statement elements. The universe of elements differs among accounting standards, such as US Generally Accepted Accounting Principles (US-GAAP) and International Financial Reporting Standards (IFRS). An XBRL taxonomy lists all the acceptable financial statement elements for a given accounting standard.

Beginning in 2009, the SEC started requiring public companies to file 10-K and 10-Q disclosures in XBRL using a US-GAAP taxonomy – maintained by the accounting community and approved each year by the SEC.

Recently, I worked with UK-based OpenCorporates to gather SEC XBRL disclosures and harvest data from them. The goal was fairly simple: walk through all the XBRL documents and gather some basic parent company data points (like total assets, total liabilities, total revenue and net income) for the latest fiscal year from these disclosures.

This task proved surprisingly difficult because of a lack of standardization between XBRL documents from different companies. For example, many companies did not report a value for Total Liabilities. One might “back into” this value by subtracting Shareholders’ Equity from Total Assets, but this doesn’t always work. A small percentage of XBRL reports even lacked a Total Assets field. On the income statement side, the dispersion was even greater, with Total Revenue, Operating Income and Net Income often unavailable.
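The "back into" calculation can be sketched as follows. The element names echo the us-gaap taxonomy, but treat them as illustrative; real filings may use variants such as equity including noncontrolling interests.

```python
def total_liabilities(facts):
    """Best-effort Total Liabilities for one reporting context.

    facts: dict mapping an XBRL element name to its numeric value.
    Prefers the reported figure; otherwise falls back on the
    accounting identity Liabilities = Assets - Stockholders' Equity.
    """
    if "Liabilities" in facts:
        return facts["Liabilities"]
    if "Assets" in facts and "StockholdersEquity" in facts:
        return facts["Assets"] - facts["StockholdersEquity"]
    return None  # not derivable from this filing
```

The fallback fails exactly in the cases noted above: when Total Assets itself is missing, or when the equity figure reported doesn't match the scope of the assets figure.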

Finding data for the latest period also proved challenging. XBRL files can contain numerous contexts. Each context refers to a reporting period (e.g., a particular quarter or year) and a scope – which may be the parent company or a particular segment of the corporation (e.g., a subsidiary). Contexts contain period elements and an optional segment element indicating which timeframe and what scope the context covers. To find the latest year’s parent company data, it is necessary to develop a program to walk through each context.
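A rough sketch of that context walk, using only the standard library. The namespace is XBRL's standard instance namespace; the filtering rule (no segment element means parent-company scope) is a simplification of what a production parser needs.

```python
import xml.etree.ElementTree as ET

# XBRL 2.1 instance namespace
XBRLI = "{http://www.xbrl.org/2003/instance}"

def parent_contexts_latest_first(xbrl_text):
    """Return context IDs with no segment element (i.e., parent-company
    scope), ordered so the most recent period end comes first."""
    root = ET.fromstring(xbrl_text)
    found = []
    for ctx in root.iter(XBRLI + "context"):
        if ctx.find(".//" + XBRLI + "segment") is not None:
            continue  # segment present -> subsidiary or dimensional data
        period = ctx.find(XBRLI + "period")
        date = period.find(XBRLI + "endDate")
        if date is None:
            date = period.find(XBRLI + "instant")  # balance-sheet contexts
        if date is not None:
            found.append((date.text, ctx.get("id")))
    return [cid for _, cid in sorted(found, reverse=True)]
```

With the contexts ranked this way, one can then look up each fact's `contextRef` and keep the value attached to the most recent parent-company context.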

These examples suggest that processing SEC mandated XBRL disclosures is less than straightforward. Indeed, the industry group XBRL.US reports finding 1.4 million errors in the universe of XBRL documents filed thus far.

A recent letter from Darrell Issa (R-CA) to the SEC notes that the agency itself is not using the XBRL files it requires corporations to file. Instead, it continues to rely on commercial data aggregators. Electronic disclosure won’t improve unless numerous eyes are scrutinizing it and reporting issues. Data sets need to be exercised; otherwise they remain unfit.

The lack of XBRL utilization represents a major threat for transparency advocates. If we ask for more accessible disclosures and then don’t use them, filers can be expected to push back. In the case of SEC XBRL, the filings are sufficiently complex to require the use of third party XBRL submission firms. In other words, it is too difficult for most companies to prepare XBRL submissions themselves – they need to use an independent preparer, just as individuals often need to hire professionals to file their annual tax returns. Corporations would undoubtedly like to economize on this cost, and can be expected to resist the XBRL reporting requirement if the filings are not used.

From my perspective, a big problem with the XBRL rollout is that it started with large public companies. By 2009, many data aggregators already had mature processes for assimilating the traditional SEC disclosure. As a result, fielded public company financial data has become a commodity; individuals can access these data for free at Yahoo Finance and many other portals. The incentive for aggregators to use XBRL is thus limited because the problem has already been solved at some level, and because the data are widely available, there is little benefit to potential new entrants.

XBRL can provide much greater benefits for data sets that have not received as much attention. I became interested in XBRL back in 2001 because I was hoping to get a standard source of private company data at my bank. The idea was to provide unlisted corporate borrowers with an XBRL template to provide quarterly disclosure.

Another high-impact application for XBRL is state and local government financial reporting. When XBRL was growing up in the 1990s and 2000s, US municipal bonds were generally perceived to be safe. That perception started to change in 2008 with Vallejo’s bankruptcy filing and the collapse of the municipal bond insurance industry. Subsequent municipal bankruptcies, culminating with that of Detroit in 2013, have reinforced the perception that municipal securities are risky. Government financial statements, which may have been ignored previously, now have significance as investors search for the next bankruptcy candidates.

However, the rollout of XBRL to other areas – such as local government – may now depend on its successful implementation in existing areas: especially the high profile SEC US public company application. If the SEC is unwilling or unable to engage, the community would be well served by collaborating to implement its own improvements. Although XBRL filing companies compete with one another, they all have an interest in the success of the XBRL standard. Thus, as an industry group, these companies can propose and implement improvements to SEC XBRL filings that will make them easier to use. For example, they can develop an enhanced XML Schema Definition which provides additional checks over and above those legally mandated. Such a schema should ensure that filers always include common financial statements such as Total Assets and Total Liabilities. It should also ensure that, within any given XBRL file, the latest period’s parent company filing is easily identified.
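One way to picture such a check layer (here in Python rather than as an XML Schema, with a made-up required-element list and tolerance, purely to show the kind of rule an industry group could agree on):

```python
# Illustrative required elements; an actual industry schema would
# enumerate these per statement type.
REQUIRED = ("Assets", "Liabilities", "StockholdersEquity")

def validate_filing(facts):
    """Return a list of problems with one filing's parent-company facts.

    facts: dict of element name -> numeric value for the latest period.
    """
    problems = ["missing " + tag for tag in REQUIRED if tag not in facts]
    if not problems:
        # Accounting identity: Assets = Liabilities + Equity
        gap = facts["Assets"] - facts["Liabilities"] - facts["StockholdersEquity"]
        if abs(gap) > 0.5:  # allow rounding to whole units
            problems.append("balance sheet does not balance")
    return problems
```

Checks like these go beyond what a plain XML Schema can express, which is precisely why the preparer community would need to maintain them collectively.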

XBRL was and remains a good idea. Transparency advocates need to ensure that it does not become an idea whose time has come and gone. To keep XBRL on track, its public company instance needs to be refined so that implementation costs are reduced. Further, it needs to be applied to other areas – such as US local governments – which stand to gain greater benefits from its adoption.