Software Auditing: A New Task for U.K. Universities

Universities in the United Kingdom have now become dependent upon software for teaching, learning and administration. Amid this explosive growth, pressure has arisen from vendors to demonstrate that software is still being used legally. Higher education is also seeking to better manage software as an "asset." This article discusses how these factors have led to one pilot project that aims to assist its peers in "software management."

U.K. universities have long benefited from the educational pricing of software, and vendors have applied appropriate licence conditions to those sales. Most recently, the issue of "concurrency" has been debated, that is, the running of more than one copy of a software program over a network. But with (at best) static financial resources available to central computing services, better management practices are sought. Vendors are keen on this, arguing that the payback for cheap software has arrived in the form of software auditing to demonstrate legality and an absence of software piracy.

But there is a "carrot" as well as a "stick" to an audit. Software auditing creates a large volume of useful data that can be used to populate IT (information technology) asset-management systems. Thus better answers can be obtained for:

  • What level of support should our help desk provide for application A?
  • Which PCs need upgrading to run a new version of the OS?
  • What is the dollar value of the software we possess?
  • What would be the implications of standardising on word processor B?
  • Who is using old versions of software package C?

Software Auditing Defined

Software auditing is the process of determining how many copies of a software program are installed, on both workstation and server hard disks, then comparing this total to the number of copies allowed by one's licensing agreement with the vendor.
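
To make the core comparison concrete, here is a minimal sketch in Python; the application names and licence counts are entirely hypothetical, and no particular audit tool works exactly this way:

    # Minimal sketch of the core audit comparison: copies found installed
    # versus copies permitted per application. All figures are hypothetical.

    installed = {"WordPerfect 6.1": 48, "SPSS 6.0": 12, "Eudora 3.0": 30}
    licensed = {"WordPerfect 6.1": 50, "SPSS 6.0": 10, "Eudora 3.0": 30}

    for app, found in sorted(installed.items()):
        allowed = licensed.get(app, 0)
        if found > allowed:
            print(f"{app}: {found} installed, {allowed} licensed "
                  f"-- short by {found - allowed}")
        else:
            print(f"{app}: {found} installed, {allowed} licensed -- compliant")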

A two-month pilot project on auditing in academia was initiated and hosted by Exeter University. Possible audit tools were evaluated and other pertinent information drawn together via the Web. Much new information was added, and U.K. higher education institutions expressed strong support in an e-mail survey.

Exeter's Implementation Path

There are essentially two approaches to implementing auditing. First, policy can be approved and procedures devised centrally, followed by an institution-wide roll-out. Conversely, a bottom-up approach can be taken: local procedures are constructed, funding is obtained for sample departments, then areas requiring clarification and management decision are identified. This latter approach was adopted at Exeter.

The university's computing facilities and services are overseen by the Computer Users' Group (CUG). At a meeting, it was proposed that a sub-committee (Software Audit Group, SAG) be established. SAG would explore the issues and practicalities of auditing at Exeter and would hold a budget to audit sample departments, thus establishing best practice. Policy documents and a procedures manual would be produced as part of the "deliverables" from this national pilot project on software auditing.

The CUG allocated funding for auditing the central computing service, IT Services, and sample departments. Following the evaluation of suitable audit tools, two products (fPrint 4.02 and InControl Audit 2.51, now called Utopia Audit) were selected. fPrint was allocated for all departments that had only PC compatibles, while InControl Audit (Utopia Audit) went to departments with both PCs and Macs.

Doing a "Walk-Round " Audit

Each participating department was asked to complete a questionnaire on the number of staff workstations in use with hard disks. Diskless workstations could only be audited for their hardware and were not included. Also excluded were workstations in public clusters (e.g. classrooms), as these hard disks tended to be reset on a regular basis.

A "walk round audit" is the only realistic firstapproach. Subsequent audits can be done over the network for mostworkstations.

Procedurally, this is how a "walk-round" audit works. A diskette is placed in the floppy drive of each workstation, and from DOS, a scanner program sweeps the hard disk(s) for loaded software and detects the hardware configuration (number of drives and sizes, adapter cards, RAM, etc.). Also completed is an asset questionnaire that includes serial numbers (CPU box, monitor, peripherals), asset numbers (if in use), user name, phone extension, location and the Workstation Identifier (usually 8 characters, as this will be a DOS file name). It is necessary to uniquely identify each workstation to the relational database "audit manager" to which the diskette's data files will be added.
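
As an illustration only, such an asset record might be modelled as below (Python); the field names are invented and imply no particular audit tool's schema:

    from dataclasses import dataclass

    @dataclass
    class WorkstationRecord:
        """One completed asset questionnaire, keyed on the Workstation
        Identifier. All field names here are illustrative."""
        workstation_id: str    # up to 8 characters: it doubles as the DOS
                               # file name under which the scanner writes
                               # its data file
        cpu_serial: str
        monitor_serial: str
        asset_number: str      # empty string if asset numbering is not in use
        user_name: str
        phone_extension: str
        location: str

        def __post_init__(self):
            # Enforce the DOS file-name constraint on the unique key.
            if not (1 <= len(self.workstation_id) <= 8):
                raise ValueError("Workstation Identifier must be 1-8 characters")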

A form and a customised disk label were produced for each workstation. This allowed dual entry of information to ensure consistency. Each department representative attended a training session on operating the scanner program. The training also explained the reasons for the audit, delivered an anti-piracy message and, in general, was used as an opportunity to publicise good software practice.

Most department representatives adopted an informal approach of contacting users at the time of the audit. In IT Services, the department was already well aware of the issues. An e-mail was sent to IT staff asking them to book appointments and to have serial numbers ready. It should be noted that no additional staff costs were incurred; staff undertook the audit as unpaid "homework."

In Practice

Most users co-operated and kept their appointments. The audit tool took 10-20 minutes to run at each workstation, with all data files and applications closed down first. The PCs that failed during scanning were generally aged machines.

The audit was done in batches. While users could choose an appointment that suited them, an entire building was still completed in just one or two days. In other departments, the representative unlocked the appropriate offices and labs on the weekend. Forms and disks were then returned to one central gathering point for uploading.

Analysis & Reconciliation

Upon uploading each diskette to the database, "recognition" of the software found takes place, matched against a library of known applications. The algorithms used vary, from matching file size, date and time to checksums of executables (definable in the audit tools). One can "teach" the audit manager new applications by adding to its library. This was essential in Exeter's case, as the libraries supplied recognised only a modest proportion of applications.
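
As a rough sketch of this recognition step, the following (Python) matches on file size plus an MD5 checksum; real audit tools make the matching attributes configurable, and the library entries shown are invented:

    import hashlib
    from pathlib import Path

    # Library of known applications, keyed on (file size, checksum).
    # Both entries here are invented for illustration.
    KNOWN_APPS = {
        (1_247_232, "9b3a0f5c0e8d4c21a7f6b2d8e1c90a44"): "WordPerfect 6.1",
        (2_118_656, "4f1d2c3b8a9e0d7c6b5a4938271605f4"): "SPSS 6.0",
    }

    def fingerprint(exe_path: Path) -> tuple:
        """Size plus MD5 digest of an executable found during the scan."""
        data = exe_path.read_bytes()
        return (len(data), hashlib.md5(data).hexdigest())

    def identify(exe_path: Path) -> str:
        return KNOWN_APPS.get(fingerprint(exe_path), "unrecognised")

    def teach(exe_path: Path, app_name: str) -> None:
        """'Teach' the library a newly identified application."""
        KNOWN_APPS[fingerprint(exe_path)] = app_name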

Existing licence details can also often be entered into the audit manager. Licences can then be allocated at the organisation, department or individual workstation level. Reports indicate the numbers of applications found and the groups or individuals needing licences.

The alternative to buying more licences is to re-allocate software resources. By tracking software concurrency via metering, a true indication of usage can be determined. Audit tools often allow creation of an "investigation" or a "deletion" disk to deal with unauthorised software or to remove items that will be re-installed elsewhere.

Another key benefit of the analysis phase is the ability to track suspect material. This could be a directory full of image files, hidden directories or even whole archives. An audit tool can examine every file on a hard disk to check its executable status as well as capture directory structures, so its "forensic" capabilities are good.
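
An illustrative sketch of such an exhaustive sweep follows (Python); it assumes the standard two-byte "MZ" header test for DOS/Windows executables and is not drawn from either product's actual implementation:

    import os

    def sweep(root="C:\\"):
        """Walk every directory under root, flagging files whose first
        two bytes are the 'MZ' signature of a DOS/Windows executable,
        whatever their extension."""
        executables = []
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    with open(path, "rb") as fh:
                        header = fh.read(2)
                except OSError:
                    continue          # unreadable file: skip, keep sweeping
                if header == b"MZ":
                    executables.append(path)
        return executables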

Determining the details of the actual licences held depends upon one's recording system. For IT Services, a paper system tracks site licences sold under educational discounts. All sales are also recorded, by serial number of the workstation onto which they are to be loaded, in the department's accountancy system. By using an intermediate report generator, reports of purchases by department can be captured. One must still import the data into a spreadsheet or a database for cross-checking against output from the audit tool.
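
This cross-check amounts to a join on the workstation serial number. The sketch below (Python) assumes CSV exports named purchases.csv and audit.csv with cpu_serial and application columns; all of these names are illustrative:

    import csv

    def rows_by_serial(path, serial_column="cpu_serial"):
        """Group a CSV export's rows by workstation serial number."""
        grouped = {}
        with open(path, newline="") as fh:
            for row in csv.DictReader(fh):
                grouped.setdefault(row[serial_column], []).append(row)
        return grouped

    purchases = rows_by_serial("purchases.csv")  # from the accountancy system
    audited = rows_by_serial("audit.csv")        # from the audit manager

    for serial, findings in audited.items():
        bought = {row["application"] for row in purchases.get(serial, [])}
        for row in findings:
            if row["application"] not in bought:
                print(f"{serial}: {row['application']} installed, "
                      f"no purchase on record")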

The Future

Exeter is still processing its departmental data, but it is hoped that the audit will answer questions such as those raised at the beginning of this article. A base Asset Register will be established, with links to other systems. Helpdesk systems can also be populated from asset-management databases.

Through standardisation, support levels for software can be determined and the costs of hardware from different vendors will be known. Thus cost savings and improved customer service are achieved, even taking into account the cost of the audit tool and the staff effort required to undertake auditing.

Exeter University is planning to extend this pilot into a two-year project, and is seeking partners and funding. Deliverables from such collaboration could include greater interoperability between related technologies (auditing, software metering, network management) to improve knowledge about what should be installed.

National Initiatives in the U.K.

The U.K. universities' body representing central computing services, the Universities and Colleges Information Systems Association (UCISA), has sponsored a Working Party to evaluate and report upon audit tools. Such tools were investigated, and an online version of the report is available (see references).

A pre-defined feature list was drawn up; essential and desired criteria were used to assess tool performance. But while two leading contenders were found and several "mid-range" products identified, the price of the programs was seen as a major barrier to outright recommendation.

Exeter has also created a Web site, the Software Auditing Home Page (see references for URL), which holds white papers and vendor contact details.


Mark Fletcher is Computing Development Officer in the I.T. Services Department, University of Exeter, Devon, U.K. He is a graduate of the Universities of Leicester and Hull, and is the university's Software Auditor. He maintains the Software Auditing Home Page and is owner of the electronic mailing list: software-auditing. In parallel, Fletcher is implementing software metering. E-mail: [email protected]


Products mentioned:
Utopia Audit; Utopia Technology Partners, Inc., Larkspur, CA, (415) 464-4500, www.utosoft.com/
fPrint; fPrint UK Ltd., London, +44 181 563 2359
See Software Auditing Home Page for others.

Software Auditing Home Page: www.ex.ac.uk/ECU/auditing/welcome.html


References:
U.K. Universities and Colleges Information Systems Association (UCISA): http://www.ucisa.ac.uk/

Word 6 version of the UCISA Report is at: ftp://ftp.ex.ac.uk/pub/projects/auditing/wprep2.doc

A Postscript version is at: ftp://ftp.ex.ac.uk/pub/projects/auditing/wprep2.prn

This article originally appeared in the 01/01/1997 issue of THE Journal.
