
Local Government, Data Analytics & Smart Cities

Cities around the globe are bringing data analytics to local government.

Just as companies have embraced data to drive business strategy, local governments can use data to drive their own. In essence, this centres on delivering a better life for citizens, and it is the backbone of the 'Smart Cities' movement.

By utilising data, local governments can make defensible, evidence-based decisions.

Data can also be used to reduce undesirable elements such as pollution and crime.

Data can also be used to attract investment into the area.

Here’s how to get started.

1) Start with a question or a problem


Don’t start with a “do all the things, collect all the data” approach. Just like a business, local government should take a strategic approach to data collection, and it should start with a question or a problem to be optimised. It could be something very simple: “Can we reduce traffic congestion in the CBD?”

In Australia, if you are a local council and you want funding to kick off a smart city initiative, you need a clearly defined goal. “Collect the data for all the things” isn’t going to cut it.

2) Collect the data you need to answer the question

This is often difficult: the data is spread across disparate sources, it is siloed, and it is painful to extract. Anyone who tells you wrangling data is easy is trying to sell you something. That’s why it is important to start with the business problem and use it to drive your data collection and interrogation efforts.

Getting the IT infrastructure up to speed to support data analytics is a process in itself. There will be legacy systems, errors from manual entry, Excel being used as a database, and so on. All of this will impact the delivery of data analytics projects in local government.
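To make the wrangling concrete, here is a hedged sketch of the kind of cleaning and joining work involved: a made-up rates extract is joined to a made-up asset register exported from a spreadsheet. Every file, table and column name here is invented for illustration, not taken from any real council system.

```python
import pandas as pd

# Hypothetical extract from a legacy rates system (all names invented)
rates = pd.DataFrame({
    "property_id": [101, 102, 103],
    "suburb": ["Bundaberg ", "gympie", "Bundaberg"],  # inconsistent manual entry
    "annual_rates": [1850.0, 1720.0, 1990.0],
})

# Hypothetical asset data kept in an Excel "database"
bins_raw = pd.DataFrame({
    "PropertyID": ["101", "103", "104"],  # keys stored as text, not numbers
    "BinCount": [2, 1, 3],
})

# Typical wrangling steps: normalise key names and types, clean strings, then join
bins = bins_raw.rename(columns={"PropertyID": "property_id", "BinCount": "bin_count"})
bins["property_id"] = bins["property_id"].astype(int)
rates["suburb"] = rates["suburb"].str.strip().str.title()

combined = rates.merge(bins, on="property_id", how="left")
print(combined)
```

Even this toy example needs three cleaning steps before the join works; real council data typically needs many more.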

3) Build internal data capabilities

Investment in the people who combine and analyse the data will pay massive dividends. In the case of New Hampshire, each $1 invested in analytics returned $91. The cost savings and opportunities from building internal data capability are huge.

Local government may of course need outside expertise from time to time, particularly in establishing best practice, as most councils are fairly early in their data journey. Ultimately, though, the goal should be to build the capability in-house and not be tied to a single vendor.

4) Start simple

Create dashboards and KPIs to monitor how the problem is tracking. You might use a proxy such as parking meter data to produce reports tracking congestion over time, with the aim of encouraging more people to use public transport.
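The parking-meter-as-proxy idea can be sketched in a few lines: aggregate transaction timestamps into a monthly count that a dashboard could chart over time. The data below is invented for illustration; a real feed would come from the council's meter system.

```python
import pandas as pd

# Hypothetical parking meter transactions used as a congestion proxy
# (timestamps and bay IDs invented for illustration)
meter = pd.DataFrame({
    "paid_at": pd.to_datetime([
        "2024-01-03 08:15", "2024-01-17 09:30", "2024-02-05 08:45",
        "2024-02-20 17:10", "2024-03-11 12:00",
    ]),
    "bay_id": ["A1", "A2", "A1", "B4", "A1"],
})

# Monthly transaction counts: a simple KPI a dashboard could track over time
monthly = meter.set_index("paid_at").resample("MS").size()
print(monthly)
```

A monthly series like this is exactly the sort of thing that drops straight into a Power BI or Tableau line chart.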

From there, start to analyse past data for trends, and ultimately use statistical models to determine which factors are likely to have influenced traffic congestion. Understanding the problem helps to develop solutions.

Over time you can move towards more proactive, preventive and automated systems, but at the start of the journey there is a lot for a local government to gain from keeping it simple.

Here are some ideas to get you started:

  • You can take a data-driven approach to proving that events in the region are attracting visitors (my first data science project for local government was exactly this problem!)

  • Geospatial analysis for fire, flood, crime etc planning

  • Fleet management dashboards

  • Mapping of crime hotspots

  • Parking sensors

  • Optimising restaurant inspections

  • Fraud detection

  • Repairs and maintenance of local government assets

  • Releasing Open Data to assist private companies to develop solutions in partnership with local government

  • Analysis of payroll and overtime to understand where staff and resources are needed

  • Water quality analysis

  • Flood risk analysis

Below are screenshots of some quick examples I put together using open data.

Please note you can do SO MUCH MORE than what I have done here if you are hosting these reports on premises.

These dashboards are interactive when hosted on a live site: users can play with toggles to slice, dice, drill into and explore the data, and so understand it better.

These reports democratise data in the organisation allowing better data-driven decisions.

Mapping of Cyclone Oswald Damage Around Bundaberg Queensland



Waste Management Collection Costs by Local Government Area Queensland



As you can see, Gympie looks like a data error at $3 to service each bin. The total cost of servicing seems right, so it is probably the total number of bins that’s off.
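Sanity checks like the Gympie one can be automated. The sketch below computes an implied cost per bin and flags implausibly cheap values; the council names are real places but every figure is invented for illustration, not taken from the actual Queensland dataset.

```python
import pandas as pd

# Invented figures for illustration (not the actual Queensland waste dataset)
waste = pd.DataFrame({
    "lga": ["Bundaberg", "Gympie", "Noosa"],
    "total_cost": [4_500_000, 3_900_000, 5_100_000],
    "bin_count": [30_000, 1_300_000, 34_000],  # Gympie's bin count looks inflated
})

# Implied cost per bin; values far below a plausible service cost suggest a data error
waste["cost_per_bin"] = waste["total_cost"] / waste["bin_count"]
suspect = waste[waste["cost_per_bin"] < 20]
print(suspect)
```

Running simple range checks like this over every column is a cheap way to catch data-entry errors before they end up on a public dashboard.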

It is worth noting that Power BI has partnered with Esri to provide spatial layers for household income, storm surge and more within Power BI. For an extra few dollars per user per month you can access these layers.

It is amazing that Power BI is available with a Microsoft subscription and is so powerful. It lowers the barrier to entry for local governments doing geospatial analysis.

Melbourne Parking Violations Analysis


A decision tree or random forest in Python, using features like time of day, day of week (extracted from the timestamp) and of course street, could be used to optimise parking inspections with this dataset of over 13.5 million records. That might even be overkill for most councils; you can still pull out valuable insights without going down the machine learning path!
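To show the shape of that approach, here is a hedged sketch of a random forest over the kind of features described, using scikit-learn on a tiny invented stand-in for the Melbourne dataset. The column names and values are made up for illustration; the real dataset's schema differs.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Invented stand-in for the Melbourne parking dataset (the real one has 13.5M rows)
records = pd.DataFrame({
    "hour": [8, 9, 10, 14, 15, 8, 9, 16, 11, 13],
    "day_of_week": [0, 1, 2, 3, 4, 5, 6, 0, 2, 4],
    "street_id": [1, 1, 2, 2, 3, 3, 1, 2, 3, 1],
    "in_violation": [0, 0, 1, 1, 0, 1, 0, 1, 0, 1],
})

# Fit a small forest on time-of-day, day-of-week and street features
features = records[["hour", "day_of_week", "street_id"]]
model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(features, records["in_violation"])

# Rank bays/times by predicted violation probability to prioritise inspections
records["violation_prob"] = model.predict_proba(features)[:, 1]
print(records.sort_values("violation_prob", ascending=False).head())
```

In practice you would hold out a validation set and score unseen street/time combinations, then send inspectors to the highest-probability ones first.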

Looking at this dataset, the smaller, quieter streets had a much higher rate of violations from people overstaying their spots. A quick exploratory analysis showed the following rates of “in violation” by street:

So the percentage of “in violation” stays was highest in the quieter streets! Now that’s interesting!

I have never, ever opened up a dataset and not found something really unexpected like that; it is fascinating stuff. Even this simple data exploration has to be performed in Python, as 13.5m rows is more than Excel can handle!

And that was literally only a few lines of Python:

import pandas as pd
dat = pd.read_csv("Parking_bay_arrivals_and_departures_2014.csv")
summary = dat.groupby('StreetName')['InViolation'].agg(['sum', 'count'])
summary['perc'] = summary['sum'] / summary['count']

So, plotting of data and exploratory analysis can go a long way to help local governments improve the lives of the people they serve.

If you want to know more, or need assistance getting the most out of your data you can:

Shoot me an email

Give me a bell: 0413 743 856

Connect on LinkedIn

Check out my website

Fill out this form


A great report on how Cincinnati brought data analytics into local government

More datasets and kernels on open government data

Thoughts from a guy visiting local governments around England

Ways councils can get more from their data

How to do data analytics in local government, an action plan with examples

Examples of local government dashboards using Tableau