Monday, 31 December 2007

Privacy - How are we doing in New Zealand?

Just as we get embarrassed here in NZ when our clean green image is tarnished by farmers polluting streams, we should sit up and take notice when our privacy is not protected in what we assume is a 'free society'. The 2007 International Privacy Ranking from the US-based Electronic Privacy Information Center and the UK-based Privacy International does not present a pretty picture of the NZ attitude to privacy. Overall, the rating represents a systematic failure to uphold safeguards. Notably, NZ is up there with the worst, leading in bad practice in communications interception.


The findings are available in PDF format by clicking here.

Tuesday, 18 December 2007

E-Petitions

It has taken a while for me to notice this bit of participatory democracy, but I have taken the opportunity to add my name to a petition.

Downing Street is working in partnership with the non-partisan charitable project mySociety to provide a service to allow citizens, charities and campaign groups to set up petitions that are hosted on the Downing Street website, enabling anyone to address and deliver a petition directly to the Prime Minister.

mySociety is a charitable project that runs many of the UK's best-known non-partisan political websites, like HearFromYourMP.com and TheyWorkForYou.com. mySociety is strictly neutral on party political issues, and the e-petition service is within its remit to build websites which give people simple, tangible benefits in the civic and community aspects of their lives. For more information about mySociety and its work, visit its website.

While you are looking, take the opportunity to consider and support this:
We the undersigned petition the Prime Minister to require all organisations to notify customers immediately of any personal data security breaches.

Thursday, 6 December 2007

Using Intalio to Develop a new Business Process

While Intalio is a BPMS tool rather than an all-singing, all-dancing development workbench, you can deliver a working solution that is useful to a business unit. In this scenario a business analyst or consultant may take a pure business solution approach without attempting to specify system services or enterprise-grade business services. A question that has been raised with me is: how do I know that the business process design is sufficiently developed to be worth investing time and money in building or buying service components? With a tool like Intalio, the answer is when the business process is executable and the business can work through it. That does not necessarily mean that the BA has to solve all the integration issues.

As the BA works through the process, s/he will encounter interactions with services that may or may not exist. In a significant portion of enterprises, there will not be a catalogue of every business service that has been implemented, let alone every one that could be desired. Rather than stopping at each point where the process needs to interact with a business service, the BA could 'simply' define the interface as the business process sees it and support it with a quick and dirty database (a bit like using MS Access to deliver your operational support systems because it takes too long to get the necessary work done with SAP). Using Intalio, MySQL and a bit of AXIS generation, the BA could refine the business process with a working example (I suspect that this may fit within the Agile manifesto). At the point that the working solution satisfies the business for flow, it could be handed over for technical improvement (integration into the normal pattern of user interface, say MS Outlook or Facebook, and integration into the back office systems of CMS and Finance). For the enterprise, there is a risk associated with this approach ... the initial delivery may become operational, and valuable enterprise-level information may remain hidden from the organisation as a whole (very much as happened with the general use of Access databases).
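As a minimal sketch of what such a quick-and-dirty stub might look like (all names here are hypothetical, and this is one way it might be done rather than a prescribed pattern): suppose the process needs a credit-check answer from a service that does not exist yet. The BA defines the interface as the process sees it and backs it with a throwaway MySQL table; AXIS's Java2WSDL can then expose the class as a web service for the process to call.

```java
// Hypothetical quick-and-dirty stub for a 'CreditCheck' business service.
// The interface is defined as the business process sees it; the backing
// store is a throwaway MySQL table in a sandpit database, not the real
// back-office system. Assumes the MySQL JDBC driver is on the classpath.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class CreditCheckStub {

    private static final String DB_URL =
            "jdbc:mysql://localhost:3306/bpm_sandpit"; // sandpit database

    // Returns a simple yes/no answer, which is all the process needs
    // at this stage of its design.
    public boolean isCreditWorthy(String customerId) throws SQLException {
        String sql = "SELECT credit_ok FROM credit_stub WHERE customer_id = ?";
        try (Connection con = DriverManager.getConnection(DB_URL, "ba", "ba");
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, customerId);
            try (ResultSet rs = ps.executeQuery()) {
                // Unknown customers default to 'no', so the rejection
                // path of the process gets exercised too.
                return rs.next() && rs.getBoolean("credit_ok");
            }
        }
    }
}
```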

Jacques-Alexandre Gerber covered this aspect in a post:

To summarize, here is how simulation and emulation can be envisioned to be used in order to indeed provide valuable business information before deploying processes in a production environment:

  1. Business Analysts create new process models in Intalio|BPMS Designer
  2. Business Analysts use Intalio|BPMS Designer simulation capabilities to ensure their process models meet their objectives and requirements as far as they can tell.
  3. IT Engineers add emulation processes and deploy them in the emulation environment.
  4. Business Analysts analyze the business reports they get from the emulation environment.
  5. Based on the reports, Business Analysts may revisit their models and go back to step #2. Once they are happy with the business outcome they can truly expect to get, it’s time to actually implement those processes.
  6. IT Engineers now fully implement processes by integrating external systems and users. The next steps are the traditional steps to deploy an application in a production system (test, acceptance, production).
I suggest that step 3 in most cases should not need a propeller-head 'IT Engineer' but a sandpit and toolset for the BA to work with ideas about what the service should look like at that point. Generally, a simple database will do the job for a business process emulation, but more exotic plug-ins may evolve in this space (for example, instant-messenger presence behaviour).
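Seeding that sandpit need not involve IT at all. As a hypothetical one-off setup (again, all names invented), the BA could create the canned-response table that the stub above reads from and walk both branches of the process:

```java
// One-off setup for the emulation sandpit: a canned-response table that
// stands in for the real business service during emulation runs.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class SandpitSetup {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/bpm_sandpit", "ba", "ba");
             Statement st = con.createStatement()) {
            st.executeUpdate(
                "CREATE TABLE IF NOT EXISTS credit_stub (" +
                "  customer_id VARCHAR(32) PRIMARY KEY," +
                "  credit_ok   BOOLEAN NOT NULL)");
            // Seed a handful of canned cases so the BA can walk both the
            // approval and the rejection paths. REPLACE keeps the script
            // safe to re-run as the emulation evolves.
            st.executeUpdate("REPLACE INTO credit_stub VALUES ('CUST-001', TRUE)");
            st.executeUpdate("REPLACE INTO credit_stub VALUES ('CUST-002', FALSE)");
        }
    }
}
```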

The world does not stand still, so the effectiveness of the business process should be measured in production. This is where the BPMS really pays off, as the throughput and utilisation of every activity are automatically gathered and available for analysis. The cycle then resumes at step 2 with process improvement.
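To make that concrete: assuming the engine writes one audit row per completed activity with start and end timestamps (the table and column names below are guesses for illustration, not Intalio's actual schema), a simple throughput report might look like this:

```java
// Hypothetical report over BPMS audit data: assumes one row per completed
// activity in an 'activity_audit' table with start/end timestamps.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ActivityStats {
    public static void main(String[] args) throws Exception {
        // Throughput and mean duration per activity over the last week.
        String sql =
            "SELECT activity_name, COUNT(*) AS completed, " +
            "       AVG(TIMESTAMPDIFF(SECOND, started_at, ended_at)) AS avg_secs " +
            "FROM activity_audit " +
            "WHERE ended_at >= NOW() - INTERVAL 7 DAY " +
            "GROUP BY activity_name ORDER BY completed DESC";
        try (Connection con = DriverManager.getConnection(
                 "jdbc:mysql://localhost:3306/bpms_audit", "report", "report");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery(sql)) {
            while (rs.next()) {
                System.out.printf("%-30s %6d completed, avg %.0f s%n",
                    rs.getString("activity_name"),
                    rs.getInt("completed"),
                    rs.getDouble("avg_secs"));
            }
        }
    }
}
```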

Somewhere in the development cycle, some human-factors engineering needs to take place. If the enterprise has a particular style of working with information ("the way we work here"), then the BA tool-set could include some helpers. For example, if the culture is to manage personal tasks through Outlook task lists, then the task management user interface could be provided through an OBA (Office Business Application).

If you are faced with inertia in database and enterprise services, then building solutions with CRUD services (of the kind sketched above) delivered as part of the business process delivery seems a good way of establishing what the business requirement is, and what the value will be, without having to deal with the triage mechanisms that stand in your way.

Wednesday, 5 December 2007

Open Information v Privacy

There is an increasing amount of personal information being collected for all manner of worthy(?) reasons, like ensuring that health providers do not use taxpayer dollars to treat aliens. Combined with the desire for more openness in government, and the means to provide raw data rather than just the results of a conclusion, there is a risk of exposure of personal information.
In the paper "Robust De-anonymization of Large Datasets (How to Break Anonymity of the Netflix Prize Dataset)", Arvind Narayanan and Vitaly Shmatikov of The University of Texas at Austin describe the problem, show a general method of de-anonymizing statistical data, and demonstrate its use in an area where the participants were under the impression that their information was anonymous.
Datasets containing “micro-data,” that is, information about specific individuals, are increasingly becoming public—both in response to “open government” laws, and to support data mining research. Some datasets include legally protected information such as health histories; others contain individual preferences, purchases, and transactions, which many people may view as private or sensitive.

Privacy risks of publishing micro-data are well-known. Even if identifying information such as names, addresses, and Social Security numbers has been removed, the adversary can use contextual and background knowledge, as well as cross-correlation with publicly available databases, to re-identify individual data records. Famous re-identification attacks include de-anonymization of a Massachusetts hospital discharge database by joining it with a public voter database [...]

We present a very general class of statistical de-anonymization algorithms which demonstrate the fundamental limits of privacy in public micro-data. We then show how these methods can be used in practice to de-anonymize the Netflix Prize dataset, a 500,000-record public dataset.
Collectors and publishers of data need to be aware of the potential for exposure of information that may be regarded as sensitive.
The issue is not limited to widely disseminated information. Individuals or special-interest groups may have a legitimate need for micro-data (for example, in health funding policy) but then have the means of uncovering personal data for an unauthorised purpose.
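To make the mechanics concrete, here is a toy sketch of the matching idea, not the paper's actual algorithm (which weights rare items and handles approximate dates): score every pseudonymous record against a few noisy facts known about a target, and only claim a match when the best candidate stands well clear of the runner-up. All names and ratings below are invented.

```java
// Simplified sketch of the style of algorithm Narayanan and Shmatikov
// describe: score 'anonymised' records against partial, imprecise
// auxiliary knowledge about a target.
import java.util.HashMap;
import java.util.Map;

public class DeAnonSketch {

    // Similarity of one record (item -> rating) to the auxiliary facts.
    static double similarity(Map<String, Integer> record,
                             Map<String, Integer> auxiliary) {
        double score = 0;
        for (Map.Entry<String, Integer> known : auxiliary.entrySet()) {
            Integer rating = record.get(known.getKey());
            if (rating == null) continue;            // item not in this record
            // Tolerate imprecise knowledge: full credit for an exact
            // rating match, partial credit for being one step off.
            int diff = Math.abs(rating - known.getValue());
            if (diff == 0) score += 1.0;
            else if (diff == 1) score += 0.5;
        }
        return score / auxiliary.size();              // normalise to [0,1]
    }

    public static void main(String[] args) {
        // Pseudonymous id -> ratings; stands in for the published dataset.
        Map<String, Map<String, Integer>> dataset = new HashMap<>();
        dataset.put("user-417", Map.of("MovieA", 5, "MovieB", 1, "MovieC", 4));
        dataset.put("user-982", Map.of("MovieA", 3, "MovieD", 2));

        // What the adversary knows about the target from public sources.
        Map<String, Integer> aux = Map.of("MovieA", 5, "MovieC", 3);

        String best = null;
        double bestScore = -1, secondScore = -1;
        for (Map.Entry<String, Map<String, Integer>> e : dataset.entrySet()) {
            double s = similarity(e.getValue(), aux);
            if (s > bestScore) {
                secondScore = bestScore; bestScore = s; best = e.getKey();
            } else if (s > secondScore) {
                secondScore = s;
            }
        }
        // 'Eccentricity' test: only claim a match when the best candidate
        // stands well clear of the runner-up (threshold is arbitrary here).
        if (bestScore - secondScore > 0.5) {
            System.out.println("Probable match: " + best);
        } else {
            System.out.println("No confident match");
        }
    }
}
```

Even with only two known ratings, and one of them wrong by a point, the toy adversary confidently picks out "user-417"; the paper shows the same effect at the scale of the real Netflix dataset.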
Consider:
  • are ethics sufficient to protect the privacy of individuals described by such micro-data?
  • is the information exposed by statistical de-anonymization sufficiently protected by legislation?
  • where would you go for assurance that the data that you are providing is not susceptible to statistical de-anonymization?