The Department of Broadband has railed against an audit of the Australian Broadband Guarantee scheme that concluded it had not kept pace with the prices and download quotas offered by metro-comparable internet services.
In an at-times scathing review [pdf] of the Department's administration of the scheme, the Australian National Audit Office (ANAO) criticised the DBCDE over the lack of hard evidence and analysis used to inform decisions about the scheme.
ANAO found on numerous occasions that the DBCDE was unable to produce adequate support documentation for the advice it provided to Communications Minister Stephen Conroy, which went on to shape decisions about minimum internet speeds and download quotas.
"It is difficult to see how the department conducted these reviews and the underpinning rationale for the recommendations put to the Minister to change (or not change) various elements of the program," the report stated.
"The review process does not reflect an evidence‐based approach to the provision of policy advice."
Entry-level plans frozen in time
Internet service providers (ISPs) that participated in the ABG scheme were required to provide at least one 'threshold' and one 'value-added' service.
The 'threshold' service required a minimum download/upload speed of 512/128 Kbps and 3GB of data each month.
However, ISPs were also permitted under the ABG to "offer cheaper, lower capacity, entry level services".
These were set at a minimum standard of 256/64 Kbps with a monthly quota of 500 MB.
About 77 percent of all connections made under the ABG were on entry-level plans. Only 21 percent of customers opted for the higher 'threshold' level plans, the report said.
Yet the DBCDE never once revised the entry-level minimum standards over the life of the ABG program, the audit office found.
The Department's only major revisions to the 'threshold' minimum standard came in May last year, the report stated.
That led ANAO to conclude that "on average, prices paid by ABG customers, while lower than would have been paid without the subsidy, have exceeded the prices paid for equivalent broadband services (in terms of speed and data allowances) in metropolitan areas."
ANAO was critical of an apparent lack of evidence used by the DBCDE to shape policy recommendations made to the Minister regarding the ABG.
On several occasions, the DBCDE was unable to turn over adequate documentation to auditors that showed why decisions affecting the ABG were taken.
While auditors acknowledged that policy settings for the ABG program were "matters for the Australian Government to determine, based on advice from its department and any other sources", they issued a sharp rebuke to department officials over the way they compiled evidence for reviews undertaken in May 2008, February 2009, August 2009 and February 2010.
"The department provided various discussion papers and spreadsheets that it used to assist in the preparation of the program guidelines in relation to commercial metro‐comparable service levels," the report stated.
"However, these documents did not include conclusions or recommendations drawn from any analysis of the market and ABG data, or comparisons with services commonly taken up.
"It was not clear if any other factors were considered and how these internal working documents informed the advice provided to the Minister in relation to proposals to change or retain existing program arrangements.
"The department was also unable to provide documentation to support its review of subsidy rates."
The May 2008 review looked at whether to double the threshold service speeds to 1024/256 Kbps.
The review concluded that such a change would cause problems for participating ISPs and become a drain on program funds.
But the department was unable to produce any formal analysis to auditors that showed how it reached this conclusion.
"DBCDE provided 'an example of its modelling', which comprised a spreadsheet with raw data from a limited sample of commercial and ABG broadband service plans," the audit report found.
"There was no analysis or modelling evident in the spreadsheet and its relationship to the advice provided to the Minister was unclear."
Further reviews undertaken in 2009 produced similarly puzzling outcomes.
"The department's August 2009 analysis indicated that, for the price bracket closest to the metro‐comparable cap ($69.44 per month), the average metropolitan speed was 17 times faster and the average monthly download allowance was seven times greater than for the ABG threshold service," auditors said.
"[But] advice to the Minister in relation to the 2009‐10 guidelines proposed that only minor changes be made and did not specifically address whether the service standards had been reviewed."
The DBCDE strongly criticised the ANAO's findings.
"To adopt average available metropolitan service and data allowance levels as a benchmark is the ANAO re‐writing the policy, a matter in our view for Government," it responded.
"The Department (DBCDE) believes elements of [this finding] suffer to a degree from a misapprehension about the program, in relation to ANAO expectations of continuous improvement in the provision by the ABG of metro-comparable services."
The DBCDE argued that had the subsidy kept pace with metropolitan rates, it could have deterred commercial ISPs from expanding into these rural and remote areas.
"Determining what service is subsidised will – ... as in this case where the market is the dominant provider – provide disincentives for the expansion of commercially‐provided services if not carefully targeted."
The audit report also observed that the DBCDE changed the program's key performance indicators (KPIs) every year, using a different set for each year of the program and making it difficult to track performance over time.
For its part, DBCDE acknowledged that frequent changes to the program's KPIs were not desirable.
More generally, the report takes the DBCDE to task for failing to account for its performance against those indicators.
"The department has not reported against the program's key performance indicators and performance targets outlined in its Portfolio Budget Statements, whether program objectives have been achieved and what outcomes can be attributed to the program's intervention," the audit found.
"Performance reporting has largely been activity-based and does not include key program elements, their results and impacts, or trends over time.
"This type of information would give greater transparency to the operation of the program, better inform management and policy decision-making, and provide context about the environment in which the program is operating."
Copyright © iTnews.com.au. All rights reserved.