Remedial learning required

Local authorities, health and education need access to bandwidth like they need access to a building. You need different pricing models and the telco world has been slow to recognise that. Laurence Zwimpfer, 20/20 Communications Trust, 2007

Technology in health and education has evolved haphazardly under ill-informed policies that allowed public good sectors to build their own networks and disparate IT systems in a competitive funding environment, without properly considering how everything might ultimately tie together.

Billions of dollars have been invested in proprietary approaches to support our physical and intellectual well-being, but the logic of achieving efficiency and cost savings by sharing common data across a single high-speed network has only recently started to register at ministerial and boardroom level.

Huge savings were possible through shared services across the health system, for example, eliminating paperwork and duplication of effort. People in outlying areas could be remotely diagnosed by specialists from major hospitals, surgeons could consult by teleconferencing, X-rays could be sent instantly and waiting lists reduced dramatically. The same principles, if applied to education, would have experts locally and across the world collaborating and sharing their wisdom and insights with classrooms of young people using interactive video-conferences as part of an on-line curriculum where lessons were updated based on the student’s pace of learning.

There are only a handful of vital areas where government involvement is imperative in a society that values its people and plans for the future. Good government should provide and maintain essential infrastructure, foster justice, equality and open participation in decision making, protect our borders, and provide the best possible health and education systems taxes can afford.

In August 2007 it was clear that something was very wrong with our health system. Emergency rooms across the country were full to overflowing, with people on trolleys lining corridors waiting to be admitted to already overcrowded wards. Even ambulances were backed up waiting to discharge patients at some hospitals and unable to answer further call-outs while they were in the processing queue. There were horror stories about the severe pain people were in, and the humiliation they suffered waiting long hours for treatment.

The government solution to the growing surgery waiting lists of people already pre-selected for further care by their GPs had been to regularly cull the lists so the numbers didn’t look as bad. Hospitals were crying out for more nurses and doctors, who were in turn crying out that they were overworked and underpaid. Meanwhile the hospital boards who administered the funding were being forced to reduce their budgets and operate with a profit motive rather than as public good agencies there to heal, restore, and reassure.

Emergency squeeze

Many emergency departments in New Zealand were already operating in disaster mode and while it was still thought they could cope in a real disaster, it would be ‘majorly challenging,’ said Wellington Hospital emergency department clinical leader Peter Freeman. He was speaking as one of two New Zealand representatives on the Australasian College for Emergency Medicine which had published a survey of Australian hospitals. It found caring for patients who could not be sent to overcrowded wards soaked up about 40 percent of emergency department workloads.

He told the Dominion Post that Wellington Hospital staff regularly had to open concertina wall dividers – designed to expand the department for mass casualties in natural disasters – to accommodate more patients. On 5 August 2007, Wellington Hospital’s emergency department had 41 patients for its 21 cubicles; seven patients were waiting for ward beds, one had been there for ten hours, and 15 were waiting to be seen. Auckland’s North Shore Hospital was so overcrowded that ambulances were stuck in a holding pattern waiting for patients to be admitted.

Dr Freeman said hospitals nationwide were ‘squeezed’ so much that some patients spent days in the emergency department, never making it to the wards. District health boards needed to make patient flow and bed management a core focus. Health Ministry Principal Medical Adviser David Galler, a Middlemore Hospital intensive care specialist, said emergency department overcrowding was well recognised. It could be eased by better systems to improve patient flow to wards and to predict high demand. The quality improvement committee, which provides advice to Health Minister Pete Hodgson, would make a recommendation on patient flow, he said.[1]

North Shore Hospital was short 60 full-time nurses while its tower block wards were running at close to 100 percent occupancy. Because of the way funding was calculated, Waitemata DHB, which had the healthiest and wealthiest constituents in the country, was the lowest-funded district health board in New Zealand.[2] Staff shortages, dissatisfaction with cumbersome systems, inefficient management, low wages, and stressful working conditions had taken their toll on the skills base within the health system for years.

National Health spokesman Tony Ryall said in August 2007 that after more than 40 health workforce reports over eight years there was still no solution to stem the downgrading of the enrolled nurse workforce.

Meanwhile the loss of doctors overseas, population growth, an aging population, and changes to the workforce were all contributing to the increased need for hundreds of new surgeons over the next 20 years, according to the College of Surgeons. The Royal Australasian College of Surgeons predicted surgery volumes would rise by 51 percent overall by 2026. It takes at least 13 years to train a surgeon – 6 years in medical school and a further 7 or 8 to complete their surgical education and training.[3]

Elder abuse

Then, three years after the event, Auckland Hospital accepted blame for a chain of errors that led to the death of Mervin McAlpine, 82, who died after being given the wrong drugs. He was a diabetic who had already lost both legs; his GP sent him to hospital because he believed McAlpine wasn’t coping at home. At the hospital, however, nobody asked him what medicines he was taking. His GP had faxed a one-page referral at roughly the same time another patient was being referred, also by fax. The two fax sheets were mistakenly stapled together and passed on to the house surgeon, who gave McAlpine the wrong drugs, which caused his body to shut down. He died days later.

“We’re accepting full responsibility for the failure to reconcile the medications at the end of the day,” said Dr David Sage of the Auckland District Health Board. He said confusion was caused because the software used in GP referrals did not print a patient ID on faxes. The system had since been changed; however, he admitted that while a lot more checks and balances had been put in place, it was still not fail-safe. A national approach and a single medication list accessible to everyone would be a further improvement.

A report from the Health and Disability Commissioner pointed the blame squarely at Auckland Hospital and called for a nationwide response to develop a co-ordinated and consistent approach to make sure people got the right drugs. An audit at Auckland Hospital investigating the treatment of elderly people taking five or more medicines from December to March 2007 found 70 percent of them had more than one medicine listed incorrectly. Medical Association chairman Peter Foley said the medical workforce was under tremendous pressure and a smooth electronic interface was needed between GPs and their specialist colleagues, with more cross-checking.[4]

The fax farce only added to the pressure to move to widespread electronic communication of medical information. It was a political hot potato with National’s Tony Ryall adding his voice to the blame game in Parliament in August 2007.

A plan to barcode patients, medicines, and staff at public hospitals to reduce the risk of medical misadventure looked as though it would proceed. After all, the government had set aside $10.2 million in the 2006 Budget to ensure there would be no funding hold-ups if the initiative went ahead, though the final cost of installing scanners and developing the back-end systems for hospital wards wasn’t known. Based on the results of overseas studies, it had been estimated that about 150 people might die in New Zealand each year as a result of prescribing errors. Between 1990 and 2006, 286 deaths in hospitals were linked to such errors, according to records filed by district health boards.

Health Minister Hodgson was urging the wider adoption of electronically readable barcodes, although he admitted barcoding could not be forced on hospitals. The plan was to give patients a barcoded wristband linked to their medical record when they entered hospital. Before medication was administered, a staff member would scan the patient’s wristband, a barcode on their own staff identification tag, and one on the medication provided by the hospital pharmacy. This would ensure the right patient was getting the right drug at the right dose.[5]
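A minimal sketch of that three-way bedside check is shown below, assuming a simple prescription record; the field names, identifiers and matching rules are illustrative only and not drawn from any actual DHB system.

```python
# Minimal sketch of the "scan three barcodes" check described above.
# All record structures and identifiers are illustrative, not from a real DHB system.

from dataclasses import dataclass


@dataclass
class Prescription:
    nhi: str          # patient identifier on the order
    drug_code: str    # medication as prescribed
    dose_mg: float


def verify_administration(wristband_nhi: str,
                          staff_id: str,
                          med_barcode_drug: str,
                          med_barcode_dose_mg: float,
                          prescription: Prescription,
                          authorised_staff: set[str]) -> list[str]:
    """Return a list of problems; an empty list means the scans check out."""
    problems = []
    if staff_id not in authorised_staff:
        problems.append("staff member not authorised to administer medication")
    if wristband_nhi != prescription.nhi:
        problems.append("wrong patient: wristband does not match the prescription")
    if med_barcode_drug != prescription.drug_code:
        problems.append("wrong drug: dispensed item does not match the prescription")
    if abs(med_barcode_dose_mg - prescription.dose_mg) > 1e-9:
        problems.append("wrong dose: dispensed strength does not match the prescription")
    return problems


if __name__ == "__main__":
    rx = Prescription(nhi="ZZZ0016", drug_code="AMOX500", dose_mg=500)
    issues = verify_administration("ZZZ0016", "RN-042", "AMOX500", 500, rx, {"RN-042"})
    print("OK to administer" if not issues else issues)
```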

Big picture strategy

Guiding the way for health information system reforms was the August 2005 Health Information Strategy (HIS-NZ), which borrowed heavily from objectives first outlined in the 1996 strategy and the recommendations of the 2001 ‘Wave’ report published by the Ministry of Health, which listed 79 ways to improve IT use within the health care sector. The revised HIS-NZ described a vision for vastly improved health information by 2008–2009. Community providers would be connected to a secure health information network, enabling seamless transfers of patients from one hospital or care provider to another. This would ensure information about medication, continued treatments and rehabilitation was more accurately and reliably relayed.

The five-year Health Strategy sought to close the gaps in existing data collections particularly in regard to community, primary care and outpatient services. It envisioned all health care providers having access to secure email, doctors being able to track and monitor the dispensing of prescriptions and laboratory tests, and ideally electronic prescribing between doctors and pharmacies. Electronic discharges would be ‘ubiquitous,’ so GPs and residential care providers always had current information about patients once they’d been discharged.

The preferred outcome was to replace the largely paper-based processes by which GPs ordered specialist consultations with a system allowing for electronic referrals. There was also a requirement that clinical information recorded when people visited primary care providers or attended hospitals as outpatients could be more easily accessed, particularly by GPs. While work on each of these areas was ongoing, it took two years after the strategy document was published for a business plan to articulate the detailed steps forward.

Several reviews of progress had been made along the way, including concerns raised by the Auditor General in March 2006 and a report from management consultant and former HIS-NZ general manager Ray Delany. He pointed out that the authors of the Wave report hadn’t examined the reasons why the proposed information strategies of the past were not implemented, other than to suggest it largely stemmed from progressive under-funding. He said the report’s writers seemed to be aware of potential problems when they said if recommendations weren’t acted on rapidly, “the sector [would] be writing another large-scale sector information strategy in five years time.”[6]

Delany said the Wave report reiterated issues outlined in earlier strategy documents, and noted the health sector was ‘awash in strategy.’ Then five years after its publication, although some progress had been made, the Office of the Auditor General expressed concern that this had been slower than anticipated and much work still remained to be done to fully implement the ‘Wave’ recommendations.[7] Delany said in June 2006 that the New Zealand health system had been attempting to bridge gaps in information availability for years.

Delany said researchers, clinicians, administrators, and educationalists must work together with skilled health information managers to achieve the best outcomes. “Experience indicates that where this multi-dimensional approach is used, the benefits are considerable. Many existing data sets are very useful sources of data that are currently underutilised. At the national level, New Zealand holds five years of mental health contact data, nearly eight years of laboratory tests, 14 years of pharmaceutical dispensing data, 20 years of hospital discharge information, 30-plus years of mortality data and over 50 years of cancer diagnoses. There are robust technology and governance mechanisms for protecting individual privacy while allowing analysis of these data to the most sophisticated degree.

“Few other countries in the developed world can boast as much. It is a national treasure and an epidemiologist’s dream,” said Delany.

Evolving health network

At the heart of the HIS-NZ and the flow of information around the health sector is the New Zealand Health Network (HealthNet), effectively a health intranet[8] linking hospitals and health care providers and embracing Telecom’s SecureMe network for GPs and District Health Boards, and HealthLink’s networks connecting GPs, pharmacies, laboratories, and radiologists. Following the Auditor General’s report in March 2006 the Health Information Strategy Action Committee (HISAC) streamlined and simplified the registration process for HealthNet, engaged with HealthLink and Telecom and was making it easier for health suppliers to understand the various network options. It had also accredited TelstraClear as a third provider and was working on seamless interconnection.

Key to the flow of data was the National Health Index (NHI), which assigns a unique code to each patient, linking clinical records. This had been continually enhanced over several years and was considered a world-leading breakthrough. To complement this, the Health Strategy goal was for a single set of definitions for clinical and administrative terms across computer systems, and for all 1500 GPs to have broadband access to data. Much of this was already in place before the strategy document was released. Work on an electronic Health Practitioner Index (HPI), a database that assigned a unique code to each clinician and administrator, had been underway since 2004 and was expected to be completed before the end of 2007.
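In practice, linking records held in otherwise separate systems on a shared identifier such as the NHI can be as simple as the sketch below; the data and field names are invented for illustration.

```python
# Illustrative sketch only: grouping records from separate systems under one
# shared patient identifier, as the NHI is described as enabling.
# Data and field names are invented.

from collections import defaultdict

lab_results = [
    {"nhi": "ABC1234", "test": "HbA1c", "value": 52},
    {"nhi": "DEF5678", "test": "INR", "value": 2.4},
]

dispensings = [
    {"nhi": "ABC1234", "drug": "metformin"},
    {"nhi": "ABC1234", "drug": "enalapril"},
]


def link_by_nhi(*sources):
    """Group records from any number of collections under one patient identifier."""
    linked = defaultdict(list)
    for source in sources:
        for record in source:
            linked[record["nhi"]].append(record)
    return dict(linked)


if __name__ == "__main__":
    for nhi, records in link_by_nhi(lab_results, dispensings).items():
        print(nhi, records)
```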

Only 400 organisations were connected to HealthNet at the time of the report but this had grown to 1000 by late 2007 and was expected to reach at least 3000. HIS-NZ required that more information be made available electronically to GPs and that had been increasing steadily over the decade, especially relating to hospital visits. All GPs were using patient management systems and hooked up to HealthNet in some way, the vast majority on broadband. More than three million messages were sent from hospitals to GPs in 2007 with 18 out of 21 District Health Boards (DHBs) involved. Network provider HealthLink, for example, was providing connectivity for 3500 organisations including hospitals, laboratories, and radiologists with 40 million items of electronic information flowing around the health sector annually.

Following his contract as head of the Digital Strategy secretariat Peter Macaulay began work with the Ministry of Health on a three-month project to try to aggregate broadband use within the health sector to leverage a better deal from telecommunications carriers and to encourage investors to create networks where they were most needed. While the thinking in 2006 was around the creation of a single next generation health network (NGHN) the acronym soon became known as the ‘not going to happen network.’

The Ministry of Health realised it didn’t have the many millions of dollars required so Macaulay started work on a much simpler model. He began expanding on a 2004 interoperability model that required vendors to agree to work together and embrace common standards and guidelines; in this case specialised health sector security and the use of virtual private networking (VPN) where there was a need to transfer sensitive information.

If all the health applications and networks complied with the requirements they could be certified with the ConnectedHealth brand and given the green light to transfer health records and market their services to the 15,000 health providers around the country. This full commercial model would require no expenditure on the part of the Ministry of Health other than the cost of a small group to manage standards, branding, and some of the sales relationships.

While District Health Boards had their own views about how things should be done, Macaulay said they had all signed up to ConnectedHealth in February 2007 and he had hoped by September or October to have the vendor interoperability model in place. “My aim was to have systems in place so vendors come up with enough high-speed wireless and fibre connectivity to considerably reduce the cost to health providers.” If the three networks already under the umbrella of HealthNet used consistent standards they would already form the core of the ConnectedHealth model. The three-layered approach to connectivity (physical, logical, and application) would mean everyone could communicate with everyone else on all layers. “HealthLink for example has an expensive but highly effective service but they needed to ensure they could connect to all other parts of the network. They were happy with that, but actually achieving it was the next stage,” Macaulay said.
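One way to picture the certification side of such a model is as a checklist applied at each layer, as in the sketch below; the specific requirements listed are assumptions drawn loosely from the description above, not the actual ConnectedHealth criteria.

```python
# Hypothetical representation of a vendor certification checklist for a
# three-layer (physical, logical, application) connectivity model. The
# requirement names are illustrative assumptions, not ConnectedHealth's criteria.

REQUIREMENTS = {
    "physical": {"connectivity offered to accredited sites"},
    "logical": {"vpn for sensitive transfers", "agreed addressing and security standards"},
    "application": {"standard message formats", "interconnection with other certified networks"},
}


def is_certifiable(vendor_capabilities: dict[str, set[str]]) -> bool:
    """A vendor qualifies only if it meets every requirement on every layer."""
    return all(
        REQUIREMENTS[layer] <= vendor_capabilities.get(layer, set())
        for layer in REQUIREMENTS
    )


example_vendor = {
    "physical": {"connectivity offered to accredited sites"},
    "logical": {"vpn for sensitive transfers", "agreed addressing and security standards"},
    "application": {"standard message formats", "interconnection with other certified networks"},
}

print(is_certifiable(example_vendor))  # True for this (invented) vendor
```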

Looking for direction

The ConnectedHealth network could easily work in with the GSN and the proposed education network in order to aggregate demand. “If there was no high-speed connection into Gisborne for example you might be able to guarantee $600,000 to deliver health, education and government services. That should be enough incentive for someone to roll out fibre and be accredited as part of the ConnectedHealth network.” While the GSN had already addressed connectivity and security at a high level, a third of all health providers were commercial suppliers so they couldn’t use that network but it was thought there would be other opportunities for them to aggregate revenue streams. One of the major partners in the ongoing development was network giant Cisco.

The health sector would join education in ‘leaning on the telcos’ to provide New Zealand with decent broadband, said Brendan Kelly, chief advisor of HIS-NZ within the Ministry of Health. He told the Wellington Government Information Services conference in May that health was joining forces with education because broadband services were essential to both sectors. In particular, broadband put scarce specialist resources in touch with health service users nationwide. He didn’t think it was unreasonable to have every GP on 10Mbit/sec and every hospital on 1Gbit/sec within the next five years. He believed the move to collaborate with the education sector made sense as the two ministries performed two of the biggest public sector functions and most schools had a GP or other health provider in their immediate neighbourhood. The Ministry’s vision of health saw it evolving from a sector to a system, where all the elements were connected, he said.[9]

While everyone was now pointed in the same direction, Macaulay said it was still taking too long to get things done. The health sector had been working on a Privacy, Authentication, and Security Project (PAS) for a number of years. The Auditor General had commented on its slow progress, although it had been claimed the arrival of new technologies including mobile and wireless computing required the framework to be revisited.

“The big issue with health is it seems to think technology has to behave differently for them than it does for everyone else, and that’s nonsense. The issues are mainly around safety and privacy. The safety issue is no different from the safety and security of financial transactions that are done every day and the danger with privacy is that some health professionals seem to have gone so overboard nothing is good enough.” He said those within the health sector willing to take risks were achieving huge things but it took so long to get through to those who were still locked into privacy and safety concerns that it was preventing innovation.

He said many of the existing hospital systems were antiquated, ‘highly safe’ or proprietary, with no consideration for standards or how they might work with other systems. While Middlemore, Dunedin, and a number of other hospitals already had internal gigabit networks for moving large files like X-rays around, ConnectedHealth was focused on getting standards and vendor arrangements in place for wide area connectivity between hospitals and out to GPs.

“We were ready to push ahead with the brand, standards, sales exercise and concurrent to that take feedback from across the health community on developing the new capabilities but we wanted to get a real live production network up and running first.” At the close of his contract he was about to sign off on a supplier model but found the ground kept shifting. He tendered again to pick up on the next phase but the Ministry of Health opted instead to go back and consult with the wider health community on its perceived requirements. Macaulay insisted there were no technological obstacles to the approach he had been championing, which was as applicable in health as it was in other sectors. “The technology is available, the obstacles are commercial and relate to a lack of investment, a lack of fibre infrastructure to achieve the connectivity and a mental block in terms of who pays for this.”

As it was, he said, the Ministry of Health was ‘spending money like you wouldn’t believe’ on a whole range of systems and processes. Running the existing HealthNet alone cost half a billion dollars a year. “If the technology was used properly it could improve the way the health system worked across all areas including ICT, business, and other systems. If they got their connectivity sorted out it could facilitate huge changes in the way people work across any of these areas and consequently bring huge savings. If someone is brought into A&E and is in no state to tell you what is going on you should have access to their health records to identify not to give them penicillin. At the moment there’s no way to do that.”

Recycling the forms

Dougal McKechnie, office manager for the independent HISAC secretariat, said one of the major issues was getting all 21 district health boards and other health providers to standardise their computer systems and software so everyone could share data. There were several relatively incompatible hospital management systems and five GP patient management systems in use across the country. He saw HISAC’s role as championing new applications that complied with standards and looking at areas where applications still needed to be developed. “Some zones are more extensive than others, for example the four transactional zones around e-referrals, e-discharges and e-pharmacy are complicated and involve a diverse range of stakeholders.”

Duplication of effort and incompatible systems across health boards and even inside hospitals themselves, however, remained a major obstacle, trapping valuable information in niche areas. Orion Systems[10] International chief executive Ian McRae said it was not uncommon for a medical facility the size of Auckland Hospital to have 200–300 departmental databases for a variety of useful research and other purposes that were not connected in any form. That meant information that could be valuable in the treatment of patients was unable to be shared.

After a six-month collaboration with Auckland Area Health Board in 2006, Orion came up with a ‘genericised’ Web-based clinical database solution, which opened up access to data from disparate sources. At last data could be stored in a consistent way so all departments had access. McRae said this would be particularly useful for hospitals that had, or were in the process of creating, major electronic records solutions for lab results, referrals, and discharges that had not previously addressed smaller departmental requirements. “We want doctors to make the right decisions about health, the medication people are on and historical lab tests so we get a far better picture of our health.” The new system, based on the needs of hospitals across the country, was being trialled in mid-2006 ahead of worldwide release to meet pent-up demand in hospitals around the globe in late 2007.[11]

General practitioners typically received some form of notification when their patients left hospital. Most hospitals provided this, but in an unstructured form – often just a letter or email. Most referrals, confirmations, and subsequent communications concerning the patient were sent by post. Eight DHBs were receiving electronic referrals from GPs in 2007 but this was not standardised. To resolve the problems Hutt Hospital, in conjunction with three vendors, developed an electronic referral application for GPs referring patients to hospitals.

A good example of progress was the new forms-based eReferral application being rolled out across 30 general practices in the Hutt Hospital region. When a patient was sent for secondary care the GP simply pulled up an electronic form which was pre-populated in real time from the Medtech patient management system database, including current medication. Lab tests and any other data could be attached. The form was sent over the HealthLink secure health network, and once synchronised with the hospital’s electronic patient records, a receipt was sent back to the GP. That messaging trail was maintained as the case was evaluated, prioritised, and acted on so everyone was kept in the loop. At each stage the specialist could write comments in the various fields and share these with the GP.

Tony Cooke, chief information officer at Hutt Hospital, said there were 29 variations on the forms, ranging from orthopaedic services to speech-therapy requirements. “GPs like it because it saves them typing in the data and specialists like it because they get a complete set of information.” The response from the coal face was that it was quicker, more reliable and made life easier. “The main thing is that referrals don’t fall between the cracks, which can have serious repercussions when things need to be addressed quickly.” Potentially the new application could be used for internal hospital referrals, GP-to-GP referrals, or referrals to physiotherapists and other community providers.
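The essence of that workflow – a pre-populated referral goes out, a receipt comes back once it is matched to the hospital record, and status updates keep everyone in the loop – is sketched below. The message fields and status values are assumptions for illustration, not the actual Medtech or HealthLink formats.

```python
# Sketch of an eReferral message trail: referral sent, receipt returned,
# status updates appended as the case is triaged. Field names and statuses
# are invented, not the real Medtech/HealthLink message formats.

import uuid
from dataclasses import dataclass, field


@dataclass
class Referral:
    nhi: str
    service: str                       # e.g. "orthopaedics"
    current_medication: list[str]      # pre-populated from the practice system
    attachments: list[str] = field(default_factory=list)
    referral_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    trail: list[str] = field(default_factory=list)


def send_referral(referral: Referral) -> None:
    referral.trail.append("sent by GP over secure network")


def acknowledge(referral: Referral) -> None:
    # In the Hutt scheme a receipt goes back to the GP once the referral is
    # matched to the hospital's electronic patient record.
    referral.trail.append("receipt returned to GP")


def update_status(referral: Referral, status: str, comment: str = "") -> None:
    referral.trail.append(f"{status}: {comment}" if comment else status)


if __name__ == "__main__":
    ref = Referral(nhi="ABC1234", service="orthopaedics",
                   current_medication=["metformin"], attachments=["recent lab results"])
    send_referral(ref)
    acknowledge(ref)
    update_status(ref, "prioritised", "semi-urgent")
    update_status(ref, "appointment booked")
    print(ref.referral_id, ref.trail)
```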

Accolades for e-health

HealthLink chief executive Tom Bowden said New Zealand was now recognised as being world-leading – just behind Denmark – in terms of e-health and primary care communications. “We have the best communications between hospitals, specialists and GPs in the world. Our job is connecting GPs to everyone else in the health sector and our counterpart in Denmark says having everything automated is saving doctors 50 minutes a day.”

HIS-NZ recommended health care providers track and monitor the dispensing of prescriptions and laboratory tests, with electronic prescribing between doctors and pharmacies. HealthLink had been working on a solution for pharmacies, similar to eReferrals. “We have a couple of hundred pharmacies connected to our network mainly for electronic claiming, and they are well positioned to go to the next stage.” The secure email-based e-pharmacy application had been trialled in a closed pilot situation and couldn’t be used in a wider context unless approval was given by the Director General of Health with exemptions under the Electronic Transactions Act, which had been applied for.

The Otago District Health Board received the go-ahead from the Ministry of Health in mid-August to begin testing electronic prescribing in 2008. Three boards in Auckland were investigating the way forward and had introduced computer systems that allowed hospital doctors to access virtually all of the state-funded laboratory test results for their patients. Health Ministry manager of governance for district health board funding, Bruce Anderson, talked about a plan to spend $114 million over 12 years on new equipment and systems to improve drug administration and patient safety, including the proposed barcode system, as part of a ‘complex web of computer changes.’ He said electronic prescribing could save 1050 lives over that period, prevent 32,000 cases of disability, and save $115 million by reducing the extra time patients had to remain in hospital because of drug errors.[12]

Another example of how things could be streamlined came through the decision by Pharmac to develop its own on-line solution. The government medicines acquisition and management agency required doctors to have specific permission to prescribe certain types of medications, but until the end of 2006 its paper-based approval process could take up to 20 days. That wasn’t helpful if patients needed medication urgently. Now its on-line system, available through HealthNet, allowed practitioners to get almost instant approval.

Around the country secure, Internet-based solutions were starting to make life a lot easier, especially for those in regions where leading-edge innovations were being deployed. Doctors travelling throughout the sparsely populated West Coast district of the South Island were now able to take all their records and patient details and interact with other health centres and professionals, even in the smallest town. Hospital X-rays could be viewed across the region and blood test results given to patients within ten minutes of their analysis. The breakthrough was enabled through the West Coast District Health Board’s Integrated Electronic Health Record (IEHR) project. “We can interact with patients at the same time because we have a common note. It’s a lot easier for us to communicate as health professionals and deliver a service,” said GP Greville Wood.

The DHB had installed a primary information systems management (PrISM) system across its 16 regional sites and a picture archiving and communications (PACS) digital radiology system, which meant the radiology department had gone filmless and now sent X-rays digitally. This was linked to an iSoft patient administration system and the MedTech GP practice software. CIO Wayne Champion, who championed the IT overhaul, said computerising the clinics meant information could be shared, eliminating the need for separate paper-based records and allowing patient details to be seen across the region. Sharing data with other hospitals and radiology specialists also improved patient care. The DHB was looking to extend the system to dental and mobile health services.[13]

Wellington Hospital had also moved away from conventional X-ray films to the medical equivalent of a digital photo. Capital and Coast District Health Board’s new digital system captured X-rays, CT scans, MRI scans, and ultrasounds in an electronic format and stored them in a central digital archive. The $4 million technology allowed doctors to view the images from any computer once they had logged into a secure system. The technology also allowed doctors to zoom and improve the quality of images on-screen. Health boards using the digital system could transfer and share medical images electronically once access and privacy issues were resolved. Capital and Coast’s system was part of a regional plan with Hutt Valley, Wanganui and MidCentral boards, in various stages of introducing the technology.[14]

Another example of innovation in health was the Late Effects Assessment Programme Information Technology (LEAP IT) project, headed by Christchurch-based Dr Michael Sullivan of the National Paediatric Oncology Steering Group, who created an on-line clinical tool to manage children and young people who had completed cancer therapy. The Child Cancer Foundation sponsored the project with $600,000 over three years, resulting in health passports for young treatment survivors. The system allowed illnesses associated with the long-term effects of cancer treatments to be tracked throughout the life of the patient, leading to better care and understanding of their medical history and at the same time providing a research database to see which kinds of treatments and therapies produced the best results. The passports allowed young cancer survivors to move around the country and the world and take their medical histories with them should any issues arise in later life.

Community health care

A critical component of the Health Information Strategy was the Primary Health Care Strategy (PHCS), including a new ICT framework for more community-based health care. The strategy had been significantly revised since 2001, and would have far-reaching consequences if it was followed through. It proposed a population-based health model with a more co-ordinated approach to health care, education and prevention. It would require a large investment in IT to ensure the sector was sufficiently connected but without duplicating existing investment. A 2007 report said people were already exposed to significant information about health and well-being, much of it over the Internet, but there was “limited access to trusted sources of information that was readily understandable by the public, and limited ability for people to have access to their own health records to maintain and track their own progress against goals.”

The PHCS would give individuals more control over their health care and education and greater access to primary care providers. It would seek to reduce the need for repetition of tests and services and improve the efficiency of moving between providers, so there was less need for them to repeat information. It would reduce duplication of effort by enabling data to be captured once and then securely and appropriately seen where and when it was needed, thereby improving productivity and process efficiencies by automating common interactions. Costs would be reduced by using tests and other information collected elsewhere. Compliance costs for data submission would also be reduced.

Greater access to population health information would enable those responsible for health, including community providers, clinicians, primary health organisations (PHOs), funders, planners, and policy makers, to make more informed decisions. An integrated view of data would be based on standard demographic profiles through the NHI, the Health Practitioner Index, and clinical data from community providers, general practice, and secondary care. The resulting population health profiles would support decision making and enable better identification of problem areas and planning: assisting in the management of immunisation, determining how many renal dialysis machines would be required over the next five years, identifying medication gaps for those at risk of cardiovascular disease, pinpointing problem areas for smoking, obesity and asthma, and showing whether diabetes screening programmes were working. New query-based tools would be required, along with forums and electronic communities to support population health analysis, including intervention information and learning, said the strategy document.
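The sketch below illustrates the kind of query such tools might run – flagging at-risk patients with no recent dispensing of a preventive medicine – using invented data; the risk flags, drug classes and window are assumptions, not anything specified in the strategy.

```python
# Illustrative population-health query: flag patients recorded as being at
# cardiovascular risk who have had no statin dispensing in the last six months.
# Data, risk flags and drug classes are invented for the example.

from datetime import date, timedelta

patients = {
    "ABC1234": {"cvd_risk": True},
    "DEF5678": {"cvd_risk": True},
    "GHI9012": {"cvd_risk": False},
}

dispensings = [
    {"nhi": "ABC1234", "drug_class": "statin", "date": date(2007, 9, 1)},
    {"nhi": "DEF5678", "drug_class": "statin", "date": date(2006, 11, 3)},
]


def medication_gaps(as_at: date, window_days: int = 180) -> list[str]:
    """Return NHIs of at-risk patients with no statin dispensing inside the window."""
    cutoff = as_at - timedelta(days=window_days)
    recently_covered = {
        d["nhi"] for d in dispensings
        if d["drug_class"] == "statin" and d["date"] >= cutoff
    }
    return [nhi for nhi, profile in patients.items()
            if profile["cvd_risk"] and nhi not in recently_covered]


print(medication_gaps(date(2007, 10, 1)))  # ['DEF5678'] in this invented data set
```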

One issue underpinning all future developments is how the Ministry of Health’s next generation national network, now known as ConnectedHealth, will shape up. It is expected to result in improved central health payment systems, better access to national data collections, and better information and connectivity. It’s described in some documents as a global first but few specifics were available in 2007. According to Treasury documents the project, plus associated payment and business systems, was expected to proceed in two stages, with stage one from July 2008 requiring additional capital funding of around $50 million and operating funding of $80 million over four years to around 2011.

However the architecture hadn’t been decided and no one was saying whether it would operate independently, follow Macaulay’s model of aggregating bandwidth across accredited providers and vendors, or shift back to making the most of existing systems. Regardless, HISAC, in its caretaker role over existing networks, was looking at how HealthNet might migrate to a new platform should one become available.

As National Health spokesman Tony Ryall pointed out in October, in-fighting between district health boards was jeopardising co-operation within the health sector and costing patients services. In a letter highlighting a breakdown in relationships between the country’s two largest DHBs and umbrella body District Health Boards NZ, Ryall slammed ‘the bureaucratic nightmare that Labour has imposed on the health system.’ He said the 21 DHBs each had a strategy and vision for everything, duplicating huge amounts of work 21 times.[15] As HISAC chairman Paul Cressey had said earlier in the year, the challenge for district health boards was to “do it once and use it many times rather than doing it 21 different ways.” The big, single database approach for individual health records was no longer supported. “Our approach is ensuring we have the right information at the right place at the right time, and that’s at the point of care. It’s more of a bottom-up approach with information from providers following patients as required.”

In his October 2007 report on the Internet economy, researcher Paul Budde said the Western world was facing a massive dilemma in relation to health care. “New technologies are increasing life expectations and improving our lifestyle. The cost of this however is enormous, and we simply can no longer afford to finance these huge advances through the public health systems. In countries with proper broadband infrastructure we see e-health shaping up as a way that will allow us to enjoy these advances in medical technology and medical services, at a more affordable cost.”[16]

If the DHBs were forced to co-operate rather than compete and the Health Information Strategy recommendations were fully implemented by the target date of 2010, our health care systems stood a good chance of full recovery. If e-health innovation at one DHB resulted in better diagnosis, faster decision making, more proactive patient care, and less crowded waiting rooms why wouldn’t you roll it out to the nation? Real-time electronic messaging – transparent information sharing with health data following the patient on an as-needed basis across a secure network – would clearly result in greater efficiency, and less inconvenience for patients, ultimately freeing up huge funds for those cruelly overdue hip replacements, cataract operations, and cancer treatments. So why wouldn’t you act now?

The vision of true e-health, identified well over a decade earlier, like many thousands of desperately ill New Zealanders, remained on an ever-expanding waiting list, needing urgent attention. The question remained whether the government would continue to prop up the inefficient systems – technological and bureaucratic – of an earlier era, or move quickly to expedite the safe delivery of connected health and put us all out of our pain.

Outside the sandpit

Entering the e-learning era

It’s been a bumpy 20-year ride, but after endless recommendations and reports the old school walls are finally coming down as leading-edge technology for teachers, administrators, and students is rolled out across the country. Education in New Zealand is undergoing a massive transition from chalkboards to whiteboards and paper mountains of administrivia to Internet-shared resources over a common infrastructure.

Internal school networks have largely been ramped up for the modern era, the use of computers in education is growing exponentially and a major effort has been made in recent years to get broadband into schools. Taking schools beyond dial-up to a minimum of 512kbit/sec was a major task that had been ignored until Project Probe provided the incentives for carriers to come up with a business plan. Some experts are now saying this hardly touched the problem, and gigabit speeds should be the goal to streamline access to a growing pool of e-learning tools and resources urgently needed to meet 21st century educational goals.

The education system, responsible for teaching and equipping a future workforce with appropriate skills for the world they have to survive in, has been slow to recognise the all-encompassing changes technology offers. One of the major obstacles, suggests educational IT specialist Laurence Zwimpfer, has been the Tomorrow’s Schools strategy introduced in 1989, which gave schools individual control over the way they developed, with no idea of what ‘tomorrow’ looked like. While the intention was to make schools more innovative and flexible, and improve the quality of education through local governance and engaging parents, no one saw the computer revolution just around the corner or realised how networking these machines might change everything. At the time networking personal computers wasn’t even on the government’s agenda.

In fact a number of schools were stung in their eagerness to get into technology from 1986 onwards by the likes of the infamous Ellis brothers and their company Computer Imports, which sold thousands of Exzel-branded IBM clone computers at 50 percent below the prices of dominant PC supplier IBM. IBM wasn’t happy; neither were many of Computer Imports’ clients, including schools, who faced hugely delayed deliveries or in some cases no deliveries at all. As for support, there wasn’t any. The chaos took the courts many months to sort through, and after the company was sued by IBM for cloning its design it went into receivership late in 1988, and there was little the schools could do about their orphan or overdue machines.[17]

Education on report

A number of reports, including those written by the Education Review Office, the Futures Trust for the government’s ITAG, and ITANZ, had recommended greater use of IT to improve information literacy. A recurring theme was the lack of a strategy. Schools could buy or were gifted computers and Internet connections on an ad hoc basis. A relatively low proportion of teachers had any IT training, and where they did it was without any co-ordinated plan involving hardware, software, and support networks.

Technology education as a separate and identifiable subject can be traced back to 1985 when director of education William Renwick called for a paper on technology education. The Beattie Report (1986) recommended greater funding in science and technology and stressed the economic and technological ends of education. In 1988–1989 a number of exploratory projects were developed in technology education; however, any real moves to action only gained ground after Lockwood Smith became minister of education. Two important influences in the process, which led to his seminal 1991 Budget statement, were Treasury, and the authors of the ‘Porter Report’ on the New Zealand economy.[18]

The conclusion of the Porter report, that New Zealand must become more innovation driven, struck a sympathetic chord with Smith, reinforcing the economic policies Treasury was promoting to him. In his July 1991 statement, Smith outlined the need for: ‘…a clear policy to enhance educational achievement and skill development to meet the needs of a highly competitive, modern international economy …’ and a commitment ‘for the modern, competitive world.’[19]

Across the sector there were a wide variety of initiatives, some initiated by government, some by companies and community groups, and others by schools themselves. Most were localised, and none were part of a comprehensive strategy designed to address all concerns of a school planning to implement IT more widely. Then in 1992 Prime Minister Jim Bolger woke up to the fact that the government should be using telecommunications in education. A handful of glossy reports pictured us leading the way into the 21st century with a highly skilled workforce. In reality we were importing more skills than ever before across a broad range of industries, and only a few measly million had been spent on practical projects to bring technology education into the real world.

Although Lockwood Smith had requested a technology curriculum in 1991, a draft statement wasn’t devised until 1993, and it was then put out for further discussion, trials, and pilot schemes. It wasn’t until 1995 that technology became part of the school curriculum.[20] The 80-page ‘Technology in the New Zealand Curriculum’[21] publication was broad ranging and highly philosophical, and the term ‘technology’ ranged across fields as diverse as graphics design, biotechnology, electronic and control technology, food technology, ICT, materials technology, production, and process technology. The term ‘computer’ was used 20 times and ‘Internet’ once.

The 1996 ‘Telecom Education Foundation’ report revealed there was one computer for every 19 primary school students and one for every ten secondary school students. Half the computers were more than three years old; 40 percent of primary school classrooms and 56 percent of secondary school classrooms had modems. Most schools intended to be connected to the Internet by 1998.[22] Education Minister Wyatt Creech and his predecessor Lockwood Smith had waxed liberally about the need to equip schools with technology and ensure students were ready for the brave new world. Maurice Williamson, however, insisted their approach required a radical rethink.

“Teachers want money for computers and software but they want to use technology in the old system. If business is going to automate you don’t stay with old manual systems as well. You can’t spend hundreds of millions getting information technology into education and still spend thousands of millions on the old regime.” But why should it be up to the teachers, or even school boards for that matter? And expecting the private sector to cough up for everything, unless we were planning to privatise education, was a bit rich, he said.

In 1996 Telecom offered all schools a second telephone line at no charge. IBM, Apple and a host of technology companies contributed equipment, expertise, and time to get some schools on track. The communications industry rallied around to help 36 schools in Wellington install cabling in 1997, and for a couple of years schools were being targeted by vendors nationwide through NetDays. Surveys showed that cost and a lack of expertise in integrating telecommunications into the learning and teaching process were the main obstacles to providing technology in schools. Wealthier boards ensured their schools had nothing but the best but the less advantaged often struggled with the basics, let alone how they might establish an Internet presence.[23]

In 1997 Maurice Williamson’s ITAG, together with ITANZ, published ‘Impact 2001: Learning with IT – The Issues,’ which identified a number of problems affecting the ability of the New Zealand education sector to make use of IT. In 1998 ITAG, consisting of 13 IT executives, academics, and others interested in the impact of IT on society, presented the report proposing strategies for change.

ITAG wanted to lay a foundation for primary and secondary education to prepare for the knowledge society. If adopted this would mean all New Zealand schools would use wide area and local area IT networks for learning and administration. Teaching resource material relevant to all parts of the curriculum would be available on-line. Teachers would have ready access to technical IT and training support at no additional cost to their school. All teachers would be aware of the potential of IT in learning and have the equipment and skills to use it.

ITAG wanted schools to have adequate income to permit IT purchase and maintenance by 2000. All students would be taught information gathering and analysis skills using modern technology. Training programmes would be staffed by teachers who made appropriate use of information technology and were confident in the use of electronic information resources. All students would regularly use IT tools and information resources across a variety of curriculum subjects, and the potential of IT in special education and in teaching in te reo Maori was to be developed.

While most schools had Internet access at some level, it proposed wider access with local caching of commonly used resources to keep down costs. It questioned the exclusive use of paper from a variety of sources and encouraged the use of electronic resources. “It is not clear that government understands in any detail the changing information needs of schools, which makes it hard to determine the best way of meeting those needs. It would be reasonable to assume that almost all information going to and from schools could, and should, be delivered electronically, which would both improve its usefulness and save money.”

ITAG proposed the government fund a web site containing resources useful to teachers which could be maintained centrally or in a distributed fashion. Contributions from all teachers would be encouraged. Resources might include lesson plans, curriculum materials, links to useful sites, and teacher-only discussion areas. “To some extent this is beginning to happen with the curriculum, teacher contracts and other material being made available on-line by the Ministry of Education. However it should in ITAG’s view be formalised and resourced,” said the Impact 2001 report.[24]

Auckland-based author and educator Gordon Dryden thought the government needed to invest a billion dollars in basic information technology, or $400,000 each for the country’s schools. “There’s a leadership void. There’s no way we’ll become the most skilled nation without spending money on technology. You can’t get away from the fact there needs to be an initial cash injection from the government for basic technology and training.” Teaching technology skills and having a technology infrastructure were no longer options but basics. Yet a piffling million dollars was made available for IT training and development of teachers for both 1998 and 1999. While the new national technology curriculum, devised over six years, became mandatory in 1999, the country hadn’t even begun to establish a national infrastructure to practice what we planned to teach.[25]

Social engineering

There were many vocal critics of technology education becoming compulsory in the New Zealand curriculum, but most had misconceptions, said IPENZ in its July 2001 report on the role of technology in education.

Education systems were still essentially training people for the industrial age and an urgent revolution in learning was needed if we were to make the most of the information revolution, warned futurist David Pearce Snyder, in the country in 2001 for the Ninth International Conference on Thinking. “In the industrial age people were mostly employed as extensions to physical machinery. In the information age we’re all going to be using knowledge to do our jobs better. The fact is that 70 percent of the value added in the productive process comes from the information input – knowledge, design, skill, research and analysis.”

He said only about 30 percent of the population could learn effectively in a passive, auditory mode by listening to a speaker; about 70 percent learned faster in an active applied setting. “The classroom only imparts real learning mastery to about one third of the graduating students. That was fine for the industrial age but now everyone is going to need those high-value skills. People learn faster by doing. Apprenticeship, internship, mentoring, community projects, teamwork, project assignments, team teaching and active learning is essential and has been shown to be so in the most successful innovative schools in the US and Europe.”[26]

Education was going through a turbulent time. In July 2003 there were 2693 schools but dozens of smaller country schools, and those with rolls that hadn’t been increasing, would close in the next few years. Teachers around the country were frustrated at the increasing demands on their time and were lobbying for better wages and conditions. In 2004 the Ministry of Education revised its strategy, admitting that the Internet would indeed transform the way the various agencies worked together and deliver their respective services. It put its weight behind the development of an ICT framework to guide the direction and co-ordinated use of ICT across the sector. It had 55 ICT-related projects to co-ordinate, many of which were e-government related.

The ministry had an ‘audience-based’ Web strategy with four key portals: LeadSpace, with governance and administration information for school leaders; Te Kete Ipurangi (TKI), with bilingual curriculum information for teachers, the Education Gazette, training courses, and content shared with Australia; Tertiary e-Learning (TEd), used by tertiary providers to access information and policies and to complete data returns; and e-learning for students, including course information and electronic delivery of courses.

The ministry was also developing software for education providers, including tools in the e-Admin programme for schools and early childhood education providers, designed to automate the collection of funding and staffing information. It was collating data from education providers to offer better access across the sector. A comprehensive redevelopment of internal information management systems was planned for 2005–2006, including a document management system that would enable better Web content management as well as enhanced software for the development of web sites and portals.

Between 2006 and 2010 learners and teachers were to be at the centre of their own communication and information networks. That meant making the best use of ICT across the curriculum to connect schools and communities, and support evidence-based decision making and practices in schools. Access to sustainable and well-supported ICT infrastructure in schools and across the education system was becoming fundamental. Through the ministry’s e-Admin programme and initiatives from the NZQA, schools were increasingly carrying out core business processes electronically. This required an increase in the use of electronic student management systems.

In the past there had been a wide range of incompatible systems and ad hoc timetable and contact management. From 2002 an accreditation programme would ensure they would all work together. The ministry funded 50 percent of the transition costs from proprietary systems to accredited student management systems.

Truant detection

By December 2008 it was expected that 98 percent of state and state-integrated schools would be using accredited systems. This would give teachers, school administrators and managers, and education agencies better student management information and help deliver improved education outcomes. A central electronic register would replace the current paper-based exchange of enrolment data between schools, improving enrolment management. With such a system in place, students not enrolled in a school within 20 days of leaving their previous school would be identified for the necessary attention.[27]
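Expressed as a rule over a hypothetical central register, that 20-day check might look like the sketch below; the record layout and data are invented for illustration.

```python
# Sketch of the 20-day rule against a hypothetical central enrolment register:
# a student who left one school and has no new enrolment within 20 days is
# flagged for follow-up. Record layout and data are invented.

from datetime import date, timedelta

register = [
    {"student_id": "S001", "left_on": date(2008, 2, 1), "re_enrolled_on": date(2008, 2, 10)},
    {"student_id": "S002", "left_on": date(2008, 2, 1), "re_enrolled_on": None},
    {"student_id": "S003", "left_on": date(2008, 3, 3), "re_enrolled_on": None},
]


def flag_non_enrolled(as_at: date, grace_days: int = 20) -> list[str]:
    """Return students whose 20-day grace period has expired with no new enrolment."""
    flagged = []
    for record in register:
        deadline = record["left_on"] + timedelta(days=grace_days)
        if record["re_enrolled_on"] is None and as_at > deadline:
            flagged.append(record["student_id"])
    return flagged


print(flag_non_enrolled(date(2008, 3, 1)))  # ['S002'] in this invented register
```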

The e-Learning Action Plan, part of the overall ‘ICT Strategic Framework for Education’ published by the Ministry of Education in 2005, addressed the challenges facing the education sector as a whole. It was aligned with the e-Government Strategy and the Digital Strategy. The e-learning plan was built on two ICT strategies for schools: Interactive Education, established by the Ministry of Education in 1998, and Digital Horizons, launched in 2002. It included the work and priorities of schools and the Ministry of Education as well as other agencies, in particular, the National Library of New Zealand.

The foundations were being laid for effective and strategic use of ICT in schools by providing professional development for educators, appropriate on-line learning resources, the building of networks, the use of software and hardware, technical support and broadband access. While there were many examples of highly effective practices, these weren’t yet fully embedded into everyday teaching practice either within or between schools, and changes in teaching practice were not yet systemic, said the action plan. The challenge was to ensure knowledge about effective teaching was rapidly spread and adopted throughout the school system. “Just like the ability to read and write, ICT literacy will be an essential life skill – an economic and social necessity. Without ICT literacy, there is a risk that people will be cut off from job opportunities and unable to take part in the full life of the community.”

Students would be supported in achieving their full potential through extending and enriching educational experiences across the curriculum as they became proficient in ICT literacy skills and developed the confidence and key competencies for independent, collaborative, and lifelong learning. Resources available to them included highly interactive digital learning objects, web sites such as Te Ara: The Encyclopedia of New Zealand, videoconferencing, CD-ROMs, DVDs, and virtual field trips. Teachers could also use networked assessment data and tools such as asTTle (Assessment tool for teaching and learning) and the New Zealand Curriculum Exemplars.

To ensure schools could run this level of technology in their classrooms and share resources between classrooms and over the Internet, the right infrastructure had to be in place. Like their computing, the network capabilities of schools across the country had evolved in an ad hoc manner. A Computer Network Survey of schools’ infrastructure was carried out between June and December 2004 so the Ministry of Education could begin to prioritise upgrades. There were 36 pilot upgrades to evaluate the issues and test the proposed new standards, ahead of a wider roll-out.

TorqueIP, a specialist division of Connector Systems, was selected as the prime contractor to install cabling, switching, and a server, aimed specifically at meeting the needs of small schools. Around 300 smaller schools with no existing network infrastructure were invited to apply. Each got a package deal: Molex-certified cabling installation with a 20-year warranty, a 24-port Allied Telesyn Ethernet switch, an Acer server running a customised SUSE Linux operating system from Smart Computer Systems in Christchurch, system administrator training, plus network design and project management of the installation. Work was due for completion at the end of 2006. Schools were expected to contribute a percentage of the costs based on the size of their rolls. A school with 77 students, for example, would be required to contribute 20 percent of the total upgrade cost, typically $4000–$6500.

There was further funding for a second stage covering 70 schools with rolls between 130 and 260 and little or no network infrastructure. This began in April 2007 and was due for completion at the end of the year. All up, $16 million had been set aside to upgrade about 400 school networks, and additional assistance for upgrading school networks and ICT infrastructure was to be sought in the 2008 Budget. One of the main issues arising from the Computer Network Survey was the lack of proper design, particularly in small rural areas where many schools had been let down by sub-standard installations. Using Ministry of Education-recommended suppliers and certified installers ensured a high-quality network infrastructure.[28]

Telecom and other carriers had steered well clear of moving into areas they considered unprofitable and that included the majority of schools in outlying or rural areas. Getting Internet at reasonable speed and cost was a challenge the MoE and the MED had tried to address earlier on. They were determined to find a way to deliver broadband Internet to all schools and provincial communities. The MoE wanted rural schools to have fast access to digital teaching resources located in the cities and offshore, with the objectives of ‘improving administrative efficiency and teaching effectiveness, enhancing the professional development of teachers, expanding e-learning … and providing a wider curriculum choice and teacher expertise, including two-way videoconferencing.’

It had already explored the options with Telecom but received a lukewarm response, the carrier believing provincial services were a financial liability, and Telecom’s proposed budget of $28 million was beyond the ministry’s financial resources. The MED’s complementary objectives were to strengthen the rural economy by giving local authorities, tourist organisations, the health sector, and businesses high-speed Internet, and to generate a competitive telecommunications market outside the main cities. The MED’s concern was that provincial New Zealand would languish without high-speed, price-competitive Internet access.

After 18 months of studying rural broadband needs, $39 million in funding for Project Probe (Provincial Broadband Extension) was allocated in the 2002 Budget, with the objective of ensuring that all 900 outlying or rural schools in New Zealand had access to broadband Internet services.

Although the original intention was that each project would be let by the Probe Steering Group (PSG), three regions (Southland, Northland, and Wairarapa-Tararua) proposed their own roll-outs but still qualified for funds. Southland had been working on broadband access for the region for some time and, as it was more advanced in its preparation, it was first to go to tender; the contract was won by wireless carrier Woosh. Telecom then stepped up its game and began showing interest in delivering regional broadband. The roll-out of the Woosh network in Southland was seriously delayed as the company ran into major resource consent issues and was forced to start from scratch, negotiating with local councils before it could roll out its infrastructure. Woosh came close to winning the contracts for Northland, Canterbury, and Wairarapa-Tararua but withdrew at the last moment, largely because of ‘capacity constraints.’

Telecom moved quickly to take over all these contracts, proving it could provide the capacity with only incremental additions to its existing telephone infrastructure. Telecom’s technology also proved cheaper to implement, which saved the project around $4 million. The company won all but three of the remaining projects; these went to Counties Power, Pacific.Net, and Iconz, which won the satellite contract. By December 2005 the use of videoconferencing had expanded rapidly, with 80 schools involved in delivering or receiving videoconferencing-enhanced courses. In 2006, 130 courses were available.

The government debrief found a number of issues could have been managed better if the steering group had undertaken an earlier evaluation process and there had been more details in the planning documents. The Ministry of Education was constrained in its resources for preparing schools to add value to the services, and a lack of support programmes for videoconferencing also slowed the uptake. The report said there was an ‘excessive expectation’ of additional funding from local government.

“Their planning and budget cycles were too slow and they were unable to allocate funds in sufficient time.” The quality of input from the regions was highly variable and in some cases entirely inadequate. “This required a good deal of assistance by Probe staff, which cost the project as much as 12 months.”

All but a couple of the contracts were completed on time, and the primary objectives – to provide broadband services to provincial schools, local government, and business interests and to generate a competitive IT environment – were completely achieved, the report concluded.[29]

Altogether 895 schools had been involved in Probe at a cost of $45 million, and most connections had been completed by the end of 2005. The last Probe school to get connected, nearly two years after the official project closed, was Wakanui School in Canterbury. “It’s good to get it. We’ll be able to go on-line without using a phone line. Downloading information was so slow and the system would kick us off if we tried to watch video clips,” said principal Mike Hill. Telecom had to install new access technology in the local exchange to extend DSL to the school, which had two teachers and 46 pupils. An estimated 60–70 schools across the country were still without high-speed Internet connections in August 2007, and the Ministry of Education was carrying out an audit to verify the details.

Piecing it all together

The next logical step was to bring all those upgraded internal networks together by connecting schools and tertiary institutes to a nationwide high-speed network, enabling sharing and collaboration through e-learning tools and Web-based resources. The ultimate goal was an open-access schools network that met international standards, including access to a range of Web services, and that could connect to the KAREN network for access to research and higher education institutions.

Mark Horgan, senior programme advisor with the education sector ICT group, was committed to a single common infrastructure throughout the education system. He was appointed in late 2004 to work for the CEOs of the Education Sector ICT Standing Committee, to align services and ensure interoperability across the sector. The Education Sector Review, released in July 2005, had recommended the Ministry of Education, NZQA, and the Tertiary Education Commission work more closely together, reinforcing the concept of the ‘virtual agency’ with each agency contributing to the outcomes of the bigger picture.

Horgan, a knowledge management consultant with considerable offshore experience, said the move to upgrade internal networks across the nation’s schools had resulted in a “surprising increase in the level of teacher confidence.” Teachers were now more comfortable that the technology would do what it was supposed to in the classroom. The implementation of the ICT Strategic Framework for Education would take things a step further, creating a strategic alignment from early childhood to tertiary through ‘smart’ use of ICT. Pivotal to this was the Education Sector Architecture Framework (ESAF), which would enable wider collaboration and sharing of resources across a nationwide network tentatively known as ConnectedEducation.

ConnectedEducation included a data model with common standards for integrating information and for exchanging and moving data across different platforms. There would be a search capability across local and international electronic catalogues and resources, making it easier to locate information. An Education Sector Authentication and Authorisation (ESAA) code for secure access would ultimately be extended so parents and caregivers could view their children’s work and results.

The National Student Number (NSN) had been in place for year 10 to tertiary level students since the National Certificate of Educational Achievement (NCEA) was first introduced in 2002. The number was based on a National Student Index (NSI) database system maintained by the Ministry of Education to enable qualifications to be associated with each student. In mid-2006 legislation was passed allowing the number to extend to children from early childhood through to year 10 so students could be tracked through the whole school system for longitudinal analysis of their data. It was also foundational for shared services interconnectivity so student records could travel between schools and systems regardless of the vendor or student management system involved, and courses would be able to move between e-learning management systems.
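
To make the interoperability idea concrete, here is a minimal sketch of a vendor-neutral record keyed by the NSN, exported by one hypothetical student management system and imported unchanged by another. The dataclass, field names, and JSON format are illustrative assumptions, not the actual NSI or ESAF schema.

    from dataclasses import dataclass, asdict, field
    import json

    # Illustrative only: a vendor-neutral transfer record keyed by the National
    # Student Number (NSN). The field names are assumptions and do not reflect
    # the actual NSI or ESAF data model.
    @dataclass
    class StudentTransferRecord:
        nsn: str                      # National Student Number, the common key
        legal_name: str
        year_level: int
        previous_school_id: str
        attainments: list = field(default_factory=list)  # e.g. NCEA results

    def export_record(record: StudentTransferRecord) -> str:
        """Serialise to a neutral format any accredited system could read."""
        return json.dumps(asdict(record))

    def import_record(payload: str) -> StudentTransferRecord:
        """Rebuild the record in the receiving system, whatever the vendor."""
        return StudentTransferRecord(**json.loads(payload))

    # A record exported by one system can be loaded unchanged by another,
    # because both agree on the NSN as the key and on the shared field set.
    outgoing = StudentTransferRecord("0123456", "Example Student", 10,
                                     "school-0042", ["NCEA Level 1 Literacy"])
    incoming = import_record(export_record(outgoing))
    assert incoming.nsn == outgoing.nsn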

ConnectedEducation would deliver a range of e-learning and Web-based resources including the Virtual Learning Network (VLN), a brokerage service based on a ‘classrooms without walls’ concept, where students and educators had the flexibility to connect with their classes 24 hours a day, seven days a week. From the site a diverse range of courses, programmes, and activities were offered by New Zealand educators. The site brokered connections between teachers and learners, joining clusters, schools, groups, and individuals who were learning through on-line programmes, including videoconferencing for curriculum support.[30] Then there was the edCentre portal, a gateway to information about local education, delivering access to educational services, tools, and resources, and advice for organisations and people at all stages of their learning lives.[31]

In September 2007 it was announced that $11 million would be made available through to 2010 for 122 schools to improve the quality of teaching through better use of technology. The funding would go towards providing audio- and videoconferencing for Correspondence School on-line learning, developing school blogs and a technology expo. “E-learning strategies can place teachers and students at the centre of their own communication and information networks. This helps create a flexible system where teachers, schools and communities can identify the needs of learners and use technology to boost their achievement,” said Education Minister Steve Maharey.[32]

Although the ConnectedEducation network was still at a conceptual stage, there was close alignment with the ConnectedHealth initiative and the GSN, and the advanced KAREN network seemed a good fit under certain circumstances. With common standards and a willingness to embrace third-party networks or suppliers, the goal was to extend fibre or other broadband links to all schools and agencies. It was hoped that aggregating requirements across health and education would encourage suppliers to make more of an effort to reach rural and remote areas, and to close the gaps in and between the growing number of fibre loops around the country. Education would specify the levels of service and standards required, including security, so third parties could be certified to expand coverage. The Broadband Challenge and the MUSH urban networks evolving across the country were ideal partners to help bring high-speed fibre to all locations where the four networks had potential clients.

PROBE not deep enough

While pleased with the success of Project Probe in extending networking to many outlying schools, Horgan described it as ‘a lowband exercise’; Probe II would need to deliver a lot more. The darling of the education community, which had become a template for the way education would like to move forward, was ‘The Loop,’ a gigabit-speed community network covering the top half of the South Island, which had grown outward from Nelson.

A loose initiative known as the Super Loop, an informal gathering of mainly educationalists, wanted to ensure educational connectivity occurred in an ordered way, taking lessons from Nelson and other locations where schools were accessing gigabit connections. The Ministry of Education was funding a series of reports and guidelines covering governance, business processes in funding, and internal technical issues including interoperability. “We don’t want people developing solutions that will make it harder later on,” said Murray Brown, the Ministry of Education’s manager of e-learning. “We need to ensure various broadband networks around the country listen and learn from what various people have done over recent years so they’re better prepared and don’t go reinventing too many wheels.”

Viable economic reasons were needed for investment in networks, including creating aggregated demand. “The Nelson situation was rather unique because the local lines company gifted spare broadband capacity and that’s unlikely to be repeated in other parts of New Zealand. It needs a lot of collaboration and co-operation between local authorities, economic development agencies, education and other users. It is the sort of capacity and capability that many regional and local body agencies don’t have a lot of skills in, so it’s pretty challenging work.”

Brown said the Broadband Challenge fund had been helpful in providing resources and support, and he was hopeful the reports from the Super Loop group would help with future decision making. People were starting to realise schools were important early users of new technologies, particularly in film, video, and multimedia, where they were often higher-capacity users than general business. A range of services and resources including videoconferencing, rich media objects, and streaming video were already rolling out to schools over the public Internet, but with the loops in place, speed, access, and reliability were no longer issues.

There were now bulk arrangements for software licensing and purchasing, reduced compliance costs, and new standards across networks and IT. Brown said the government had been investing about $10 million a year in professional development for teachers over seven to eight years, as well as $18 million a year on 38,000 laptops for teachers. “It’s been a matter of trying to get a balance between the things that can be done at the centre and what the schools need to take responsibility for. Certainly in the past decade we’ve had a more co-ordinated approach but up until then it was pretty much hands off,” he admitted. The overall goal was to have on-line resources and services available to teachers and kids in learning spaces wherever they happened to be at any given moment.

Education networking specialist Laurence Zwimpfer said schools had been encouraged to head down the IT path, but most hadn’t been given the tools to do this. They were trying to run fairly sophisticated businesses with a large, demanding body of students and teachers, but IT was often seen as an interesting add-on that didn’t quite fit the school curriculum. “The reality is IT is now a critical component in running our schools – even more critical in terms of giving kids the sort of education they need, yet we’re still running it in a fragmented way on a shoestring budget.”

What was needed was a similar approach to the GSN; in other words a highly secure, high-speed nationwide network backbone all agencies and schools could hook into for shared access to resources. “I took the idea of a school network to former education minister Lockwood Smith in my days at Telecom, thinking this was an offer he couldn’t refuse. He said I would have to go out and sell it to all the schools individually and at that stage there were about 2700 schools. I worked out how long that would take me and figured I’d be long gone by then.”

Zwimpfer said a much broader vision for what could be achieved with technology in the education sector was needed. While Project Probe solved many problems and gave everyone the opportunity to connect, the next mission should be to scale that up a hundredfold. He said the government-funded multi-gigabit KAREN network, serving the nation’s universities and Crown research institutes, was a good example of what could be achieved for other sectors. “Local authorities, health and education need access to bandwidth like they need access to a building. You need different pricing models and the telco world has been slow to recognise that.”

He said a network of 2700 schools was not big in global terms, and a big-picture vision for all schools to become part of such an approach was necessary. There had been progress in some areas, but the roll-out of fibre-optic loops to urban areas in Nelson, Christchurch, Auckland, North Shore, and Wellington had been painfully slow. While Project Probe was supposed to be the catalyst to involve entire communities, once the schools got connected the incentive was not there, particularly if the schools only had 512kbit/sec connections.

Connecting communities

The next stage needed to take into account what was happening in the wider community. “There wasn’t enough thinking about how the capacity would be locally reticulated. It was a bit like what New Zealand did in the Pacific Islands in the 1970s. We put in satellite dishes in one location and told them they were now part of the 21st century, but no thought was given about how they were going to provide a service to other parts of the island or other islands within the group; 30 years on we still haven’t solved that problem.” Zwimpfer said the country was at risk of doing the same thing with schools. The government was still saying, “Look, we’ve put in a broadband connection into a school in your community, so what’s your problem?”

The closest thing to a solution, he suggested, was exemplified by the service provided by Waikato University to the Tuhoe people in the foothills of the Urewera Ranges, connecting their school hubs and, through them, local homes in the community using ‘good old wi-fi.’ “It’s a very bottom-up, community-owned process that certainly didn’t get a lot of help or engagement from the telcos. That’s not necessarily a bad thing because you end up with another model of locally owned fixed price infrastructure rather than a user-pays one.”

That solution was partially funded by the CPF, promoted by the 20/20 Communications Trust, of which Zwimpfer is a member, and owned by the Tuhoe Education Authority. “If Telecom had tried to go in there and put up repeater stations on sacred mountains they’d probably still be negotiating resource consent and opposition from the locals; instead the community couldn’t have been more helpful because they have ownership and can see the immediate benefits.”

Zwimpfer said many city schools were now on a gigabit path where they could collaborate and share resources, run shared server farms, and potentially connect into KAREN and internationally. “This is happening in the city areas where we’ve got fibre but it is creating the same sort of gap we had five years ago between the city schools with their 512kbit/sec connections and the rural schools with a dial-up 24kbit/sec connection, if they were lucky. We need to run the Probe process again because the market is not suddenly going to put huge investment into Ruatoria and other remote areas. We don’t want to wait another ten years until everyone is screaming again; we need the government to come in, make the decisions now and get on with the investment.”

It was essential to keep the gap from widening, otherwise there would be major problems ahead for education. “We already have this situation where some kids are getting access to a whole range of sophisticated opportunities and others are not. Some schools are able to run more efficiently; for example in Wellington, where they are moving to professionally managed and backed-up shared server farms. This means they can get rid of their own server infrastructure, which they have been trying to manage with meagre resources, while paying technicians a fraction of what they might earn in the commercial world.”

In the United States about 25 states had developed their own Internet-based curriculum, and about one third of public schools, some 6000, were offering credits for on-line classes, according to the US Department of Education; an estimated 700,000 students took these classes. Marc Prensky, a New York trainer, educational games developer, and visionary[33] who spoke at the International Confederation of Principals in Auckland in April 2007, said that for the first time schools had serious competition.

“Today’s education is bifurcating. There’s the school with its certificates, and outside school there’s another world, an on-line world, that kids find exciting. Here they learn without us. They’re highly motivated. They work with peers. Things actually happen. The only way for schools to compete is to involve students with everything. We can no longer just tell students what is right or best. We have to ask them what they think … Have kids’ representatives at parents’ nights and faculty meetings; take them to conferences. I’m talking smart kids who will give their point of view. We have to involve students in everything we do.”

Prensky, who has created more than 50 educational games, said games were the paradigm for learning with engagement. “Complex games that take time to play, involve lots of skill, goals, learning and decision making. How come in a class of 45 minutes students haven’t made 50-100 decisions and got feedback? In a game you make them every half second … The goals in class are ‘do this’; the goals in games are always ‘be a hero’ … there’s a definite motivation there.”

Computing the key

His response to people who throw their hands up in the air and say, “The future is going to be so different, what are we going to do? We don’t even know what people’s jobs are going to be,” is, “Well that’s true for 100 years time – but when these kids grow up they’re going to be interacting with others via technology. No matter what the job they have, they’re going to be connected by some kind of computers. It’s stuff you don’t have to be a tremendous seer to predict but we’re not teaching any of that … Kids used to grow up in the dark. When they went to school, teachers gradually showed them the light. That’s what was so noble about our profession. But today kids grow up in the light – they learn what’s going on from the Internet and TV, and they have cellphones. Teachers don’t want to engage with the kids on different stuff; they want to engage the kids on old stuff. So we’ve got a real problem with school becoming less and less relevant. Even with the best intentions we put them back in the dark.”

So what was his vision for education? asked Interface editor Greg Adams. “It’s about everybody reaching their potential … That means the ability to go in the directions that inspire or interest someone, but to be guided at the same time, to work with peers, to follow your heart and your instincts, and basically have a good time growing up. Not everyone’s going to be an intellectual person, but people should, at the very least, not be afraid of learning. Everybody has interests and should be able to know that they can follow those and find others who share those interests. Somebody said it really is not the knowledge revolution, it’s the connectivity revolution. The more we help young people understand how exciting it is to get involved the better. We’re in such interesting times to grow up in.”[34]

Meanwhile there remained concerns about future funding to expand technology use in schools after Telecom pulled out of a 14-year deal that saw a small percentage of parents’ phone bills go towards increasing the digital capabilities of primary and secondary schools. The School Connection sponsorship programme had raised $120 million over that period, leaving many poorer schools in particular wondering how to replace what often amounted to tens of thousands of dollars in their already stretched budgets. If that wasn’t enough, it was announced in September that it had cost parents close to $54 million more in 2006 to provide additional computer equipment and other essentials for schools. Fundraising had brought in $80 million, up from $65 million in 2005.

The 30 percent hike in funds from parent donations and school fundraisers was mostly raised at fairs, auctions, and gala balls, and went on running after-school sports and music programmes, some building work, and scholarships. Parents had already given $155 million in fees during 2006, a $40 million increase on 2005.

National Party education spokesperson Katherine Rich said communities were under constant pressure to contribute to ‘free education.’ School Trustees Association president Lorraine Kerr said parents had traditionally raised money for the ‘nice to haves’ but were now helping to buy necessities. “I think it’s pretty poor that parents have to dig deep to subsidise what is supposedly their right.” She said the government had spent about $60 million on information and communications technologies in schools. “But we also know that parents have had to raise another $10 million to top that up.” School principals said they had to use a range of methods to supplement public money and provide extra resources.[35]

Meanwhile the way the government allocated funding to schools was also clearly out of step with how many smaller communities operated. For example, Apiti School, in the outback of north-west Manawatu, was savaged by a huge reduction in per-child funding, from around $96 to $22, in 2008.

It was one of 14 schools in the country to face a change in decile status by the Ministry of Education based on old census figures, which Apiti principal Mary Cumming could only put down to the loss of one family in 2006. Since then the school had added two large families, meaning it now had to cope with more children than before on less funding. A number of schools appealed.[36]

Building capability

As far as Internet pioneer John Hine was concerned, the greatest need in the education sector, from primary through secondary to tertiary, was to continue building capability, an area where persistence made all the difference. He was on the board at Wellington Girls’ College when the laptops for teachers programme was introduced, and helped start the Tech Angels programme, in which students showed teachers how to use the technology.

“I was just amazed how that school changed in three years. In the first year you could point to teachers putting the laptop in their bottom drawers and wondering why they had been given it. In the second year they were preparing their assignments on it, and the third year the students were using laptops in their classes. We saw foreign language students reading foreign newspapers on-line from their laptops, and a German or French article projected on a big screen for discussion. Those changes happened pretty quickly. Once the equipment was there it took a bit of willpower, and a clear vision to make sure teachers didn’t forget what they should be doing with these laptops. That sort of thing has to happen in schools all over the place and it has to stick.”

The risk, he said, was if the person who had been driving the technology moved on, a programme could flounder. “You really have to keep developing that capability,” said Hine. Similarly, he said, the uptake of technology in health could make a major difference in the way things were done, but that would require finding a way of getting past the issues of privacy and sharing personal information.

“There’s a lot to be gained in the health sector; for example when you realise that something like 30 percent of the health budget in 2006 was spent on fixing things that were going wrong. If you can reduce that you can put more money into more operations.” In the end, he said, people needed to have confidence; they needed to see how securely a successful banking operation keeps its customer information, and how this might impact on other sectors.

Footnotes

[1] Rebecca Palmer, ‘Emergency wards “already in disaster mode”,’ Dominion Post, 6 August 2007

[2] Carol du Chateau, ‘Emergency department hell,’ NZ Herald, 30 June 2007

[3] Rebecca Palmer, ‘Hundreds more surgeons needed,’ Dominion Post, 23 August 2007

[4] ‘Auckland Hospital accepts blame,’ TVNZ, 1 May 2007

[5] Tom Pullar-Strecker, ‘Patient barcodes all but certain,’ Dominion Post, 20 August 2007

[6] Ministry of Health, ‘From strategy to reality: the WAVE project,’ report of the WAVE Advisory Board to the Director-General of Health, Wellington, New Zealand: Ministry of Health; 2001.

[7] Office of the Auditor-General, ‘Progress with priorities for health information management and information technology,’ 2006. Available at http://www.oag.govt.nz/2006/wave/docs/wave.pdf

[8] An internal secure Internet-style network

[9] Stephen Bell, ‘Health pushes for broadband,’ Computerworld, 21 May 2007

[10] Orion Health signed a contract for its 140th Rhapsody clinical workflow and integration engine site in North America at the end of October 2007 when it won a deal with Marion General Hospital, Indiana. Orion has 215 such sites worldwide, including the Centers for Disease Control, the US Department of Agriculture, Central Washington Hospital, and the New York State Department of Health. Orion has 65 staff in its Santa Monica office and planned to open a second US office in Boston. Orion made the North American list of the top 100 health care IT providers for the second year in a row. Source: Ulrika Hedquist, ‘Orion wins 140th US contract,’ Computerworld, 25 October 2007

[11] Keith Newman, ‘Web innovation to unlock access to critical patient data,’ m-net, 12 June 2006

[12] Martin Johnson, ‘Wired for saving lives,’ NZ Herald, 25 August 2007

[13] Darren Greenwood, ‘Health looks to ICT to deliver better patient care,’ Computerworld, 28 June 2007

[14] Rebecca Palmer, ‘Medical X-rays and scans go digital,’ Dominion Post, 7 July 2007

[15] Tony Ryall, MP, press release, ‘Bureaucratic infighting costing patients,’ 22 October 2007

[16] ‘Web 2 Revives Internet Economy,’ 2007 Global Internet, Volume 1, Paul Budde Communications, October 2007

[17] Keith Newman, ‘Exczel owners get warranty offer,’ Computerworld, 23 January 1989

[18] Crocombe, Enright, & Porter, 1991

[19] ‘Technology: flogging a dead horse or beating the odds?,’ Brent Mawson, Head of Centre, Technology Education, Auckland College of Education (1998)

[20] Dr Maris O’Rourke, Secretary for Education, Technology in the New Zealand Curriculum foreword 1995

[21] Technology in the New Zealand Curriculum, Learning Media, 1995

[22] Keith Newman, ‘Schools flunk technology,’ Metro, June 1998

[23] Ibid

[24] ‘ImpacT 2001: Strategies for Learning with Information Technology in Schools,’ A Submission to the New Zealand Government by the Minister for Information Technology’s Information Technology Advisory Group 1998

[25] Keith Newman, ‘Schools flunk technology’

[26] Futurist lifestyles editor David Pearce Snyder in Auckland, New Zealand for the Ninth International Conference on Thinking, January 2001

[27] ‘Enabling the 21st Century Learner - An e-Learning Action Plan for Schools 2006–2010,’ Ministry of Education, 2006 Learning Media Limited: http://www.minedu.govt.nz/Web/downloadable/dl10475_v1/itc-strategy.pdf

[28] School ICT Network Infrastructure Upgrade Project, Ministry of Education

[29] Government case study, 11 January 2006: http://www.e.govt.nz/resources/research/case-studies/project-probe/project-probe-casestudy.pdf

[30] http://www.virtuallearning.school.nz

[31] http://www.edcentre.govt.nz

[32] NZPA, ‘Schools given $11.1m for e-learning,’ 18 September 2007

[33] Cited as one of training’s top 10 visionaries by Training magazine

[34] Greg Adams, ‘Involving students in everything we do,’ Interface magazine, August 2007

[35] Martha McKenzie-Minifie, ‘$54 million more for “free schools,”’ NZ Herald, 1 September 2007

[36] Jody O’Callaghan, ‘Decile change rocks school,’ Manawatu Standard, 19 October 2007