Come join us at ADSEI!

Welcome to everyone who has signed up to the Compute it Simple blog. Please come over to the ADSEI site, join the mailing list there, and sign up to help our kids become data and tech literate. ADSEI’s mission is to connect all of the teachers who are teaching data science with all of the data scientists who want to help teachers and students become data literate. We offer training in integrating data science with your curriculum – from English, History, and Geography through to Maths, Science, and Technology.

We are building a community of practice of teachers, data scientists, and developers who want to help our students become data and tech literate. Come join us, and help forge a new, data literate future.

You can check out our Executive Director’s YOW! Night talk, and a recent opinion piece in The Age, The Sydney Morning Herald, and other Fairfax outlets.

This website will no longer be updated. Head on over to the ADSEI site to get resources and support for teaching data science.




Posted in Uncategorized | Leave a comment

Australian Data Science Education Institute

I am thrilled to be able to launch the Australian Data Science Education Institute.

We are a Not for Profit organisation dedicated to supporting the teaching and learning of Data Science at all levels, and right across the curriculum. We can help you put Data Science into your Science, Maths, and Humanities classes. Check us out at the ADSEI website.

I am moving my Computing Education blog to the ADSEI site, so check out the blog page and sign up to the mailing list to keep track of the resources, ideas, workshops, and community of practice that ADSEI can help you with.


Precision Oncology

The HPC Impact Showcase provided a brilliant talk by Rick Stevens from Argonne National Laboratory. These are my notes – as usual, stream of consciousness; all mistakes are almost certainly mine.

Rick gave an excellent intro to what cancer is and why it’s a problem. I am leaving out the technical details of the DOE/NCI collaboration he spoke about, and trying to focus on what this kind of research means for you.

What is Cancer?

Cancer is a large number of complex diseases. Each behaves differently depending on the cell type it originates from, and many factors influence the disease – age of onset, invasiveness, responsiveness to particular treatments.

It is basically abnormal cell growth that the body can no longer regulate. It spreads when pieces of the tumour break off and travel to other parts of the body – metastasis.

Cells that start to grow abnormally are normally destroyed by programmed cell death or by the immune system, but sometimes cells bypass these death signals and continue to grow.

Cancers acquire the ability to ignore cell growth regulators, avoid the immune system, and grow without limit, which makes them catastrophic for our systems.

Why is cancer hard? Because bodies accumulate mutations over time, and it usually takes two or three mutations before tumour suppression genes get deactivated or oncogenes get activated. Cancers also disable DNA repair mechanisms, so mutations occur at a faster rate. As you get older you accumulate mutations and very complex behaviours. Chemotherapy can work, but often some subset of cells will eventually become resistant due to high mutation rates.

Tumours develop drug resistance due to massive heterogeneity. No one drug will target all of the varied cells. So different cells in the same tumour will respond differently to the chemotherapy or other treatment.

Precision oncology is about personalized cancer therapy. The population of cancer patients is not homogeneous and neither are the cancers, so take personal measurements of both person and tumour and apply models that predict how well specific treatments will work for particular patient profiles. The idea is to use molecular data to assign treatment based on likelihood of success.
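As a toy sketch of the idea – with entirely invented drug names, marker names, and weights, nothing like a real clinical model – ranking treatments by predicted likelihood of success might look like:

```python
import math

# Hypothetical per-drug weights over molecular markers -- invented for
# illustration, not real oncology data.
DRUG_WEIGHTS = {
    "drug_X": {"marker_a": 2.0, "marker_b": -1.0},
    "drug_Y": {"marker_a": -0.5, "marker_b": 1.5},
}

def success_probability(weights, profile):
    """Logistic score: a higher weighted marker sum means a higher
    predicted probability that this treatment will work."""
    score = sum(w * profile.get(m, 0.0) for m, w in weights.items())
    return 1 / (1 + math.exp(-score))

def rank_treatments(profile):
    """Return treatments sorted from most to least likely to work."""
    scored = {d: success_probability(w, profile)
              for d, w in DRUG_WEIGHTS.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# A patient whose tumour strongly expresses marker_a:
print(rank_treatments({"marker_a": 1.0, "marker_b": 0.2}))
```

The real systems use vastly richer molecular data and trained models, but the shape of the output is the same: a ranking of treatments by predicted success, not a one-size-fits-all prescription.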

One of the motivators for the DOE/NCI joint project in this space is that new cancer drugs on average cost $100,000 per year per treatment, and the average survival improvement is only 6 months. That’s pretty mind-boggling.

So the impact of High Performance Computing precision oncology research is to create systems that will take blood tests, biopsy results, and maybe even images from CAT scans and the like, and produce an accurate picture of which drugs are most likely to work for you.

At the moment, despite the ways described above in which individual cancers, and even individual cells within cancers, can vary, our treatments are largely uniform. If you have cancer A you will get drug X. And if that doesn’t work they will move on to drug Y, and so on. With these new systems – which are really close to practical implementation in the clinic – your doctor can feed your test results into the software, and it will spit out the probability that each of the available treatments will actually work.

HPC and precision oncology could save your life. What more proof do you need that #HPCMatters?


Smart Cities

As always, these are fast-typed, barely edited, streams of consciousness. No accuracy implied. 🙂

These are my notes from the Smart Cities Plenary.

“Imagine an intersection that can calculate the trajectory of all the vehicles coming at it, and it detects that one of the vehicles is coming too fast to stop, so we don’t change the pedestrian sign to walk or the other signal to green until that car is safely through. That kind of calculation is going to require HPC right at the intersection. You haven’t got time to send it off somewhere and wait for the calculation to happen and come back.” Charlie Catlett on the value of High Performance Computing (HPC) to Smart Cities.
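The safety check Catlett describes boils down to basic kinematics: compare each vehicle’s stopping distance with its distance from the intersection. A minimal sketch – the 7 m/s² braking figure is an illustrative assumption, not a traffic-engineering standard:

```python
def can_stop_in_time(speed_mps, distance_m, max_decel=7.0):
    """True if a vehicle travelling at speed_mps can stop within distance_m.

    Stopping distance from basic kinematics: v^2 / (2a). The default
    deceleration of 7 m/s^2 is a rough dry-road braking figure chosen
    for illustration only.
    """
    stopping_distance = speed_mps ** 2 / (2 * max_decel)
    return stopping_distance <= distance_m

# 60 km/h (~16.7 m/s), 40 m from the intersection: can stop.
print(can_stop_in_time(16.7, 40))   # True
# 100 km/h (~27.8 m/s), 40 m away: too fast -- hold the pedestrian signal.
print(can_stop_in_time(27.8, 40))   # False
```

The point of Catlett’s quote is that this check has to run for every approaching vehicle, continuously, with no time to round-trip to a distant data centre – hence compute at the intersection itself.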

We hear a lot about Smart Cities these days, but what actually are they? I loved this comment from Debra Lam: Smart Cities are the application of technology & data to improve our quality of life.

Or this one from Charlie Catlett that defines the problem beautifully: “How do we take our computational science capabilities and apply them to the challenges that cities have?”

“When you look at a city it is not a monolithic thing. It’s a collection of neighbourhoods.”

Where you live in that city makes a significant difference to your quality of life.

You can track respiratory disease, and there are significant differences between neighbourhoods. Where you live also impacts on what opportunities you have – e.g. what jobs are within reach.

We have systems now that detect food safety violations a week earlier than previous systems, by clever use of data.

“We need to start thinking about cities as integrated, complex systems that can’t be studied by just pulling one system out and not looking at it in the context of the entire system, the entire city.”

As Shakespeare said, “What is the City but the people?”

Smart Cities are about making life better for people. By building better spaces that foster communities. By instrumenting our traffic control systems – traffic lights and the like – to prevent accidents and decrease congestion. By improving access to healthcare and education. By predicting future problems such as flooding and making sure there is infrastructure in place to deal with it.

One beautiful example, which I think came from Michael Mattmiller, Chief Technical Officer of Seattle (to me it’s something of a revelation that a city has a CTO!) is of a severe rain event that caused flooding. Using HPC we can now model cities (based on recorded rainfall data) and predict where the flooding will be worst. Then we can send repair crews in to those areas to make sure that any burst or blocked stormwater drains get fixed straight away. That way problems remain minor instead of becoming catastrophic failures.

Smart Cities aren’t about technology, they’re about making life better. As Debra Lam put it: “I’m not starting with the technology, because that means you’re starting with the solution, which probably means you don’t really understand the problem.”

Step 1 in Smart Cities is to identify problems and opportunities. What do people need? How do we make life better? Technology is a path to that. But it’s not the point of the exercise, which people sometimes forget. Technology is just a powerful tool in our quest for better lives.

For people who are interested in City data, Seattle has an open data policy by preference, so there is a treasure trove of data available through their open data portal.

Of course, instrumenting cities and collecting data is challenging from a privacy and civil liberty point of view, so we need to be talking about these issues and working out policies and approaches that work in different cultures. As we were told last night, different places have different cultures and different concerns. “A traffic light is an instruction in Milan, a suggestion in Rome, and a Christmas decoration in Naples.” We have to fit the solutions to the context. The fundamental principles of change management tell us that we need to get buy-in – which means communities need to be involved in making their cities smarter. They need to identify the problems and be part of the construction of solutions, rather than having solutions imposed on them.

For more information on Smart Cities, check out Nick Falkner’s pieces in The Conversation.



SC17 and Quantum Computing

Once again I’m at the SC17 Supercomputing conference with four amazing year 10 students from John Monash Science School. This time we’ve dragged a chem teacher along for the ride. As always I intend to blog and it will be largely a stream of consciousness set of notes. Not very polished, because otherwise they’ll never make it on here! But hopefully useful, or at least interesting. I also don’t vouch for the accuracy or correctness of what I write. What you see is what you get. 🙂

Yesterday we started Supercomputing with a bang at the DWave Quantum Computing Workshop. While some of the material was out of our reach, it was a fascinating introduction to the history of Quantum Computing, and a more realistic estimate of the state of the art than the popular media tends to give.

Although the media often describes Quantum Computing as just around the corner, it’s important to note that we do not yet have what they call “Quantum Advantage” – in other words, although we have some small quantum computers, solving a problem on a quantum computer remains slower than solving it on a classical computer.

That’s now.

What will we have in 10, 20, or 30 years’ time?

There are some seriously head spinny aspects to Quantum Computing.

For starters, with quantum systems you can massively increase the size of your dataset without increasing computation time.

I can’t explain how that works (I suspect there really aren’t that many people who can!), but I know that it’s a game changer.

On a normal computer if, for example, you have to search for a piece of information in a dataset the size of a short book, it will take much less time than searching for the same piece of information in a large dataset – think War and Peace. And it will take far longer again – to the point of impossibility – to search a library-sized dataset, or worse, the internet.
On a quantum computer, the size of the dataset doesn’t impact on the runtime.
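To see what a big deal that is, here is the classical side of the comparison – a minimal unstructured search whose worst-case cost grows in lockstep with the size of the dataset:

```python
def linear_search(items, target):
    """Classical unstructured search.

    Returns (index, comparisons_made). With no structure to exploit,
    the worst case examines every element, so cost scales with size.
    """
    comparisons = 0
    for i, item in enumerate(items):
        comparisons += 1
        if item == target:
            return i, comparisons
    return -1, comparisons

# Worst case: the target is the last element, so the work done
# tracks the dataset size exactly.
for n in (1_000, 1_000_000):
    _, comparisons = linear_search(range(n), n - 1)
    print(n, comparisons)
```

A thousand times more data means a thousand times more comparisons. That linear wall is exactly what quantum approaches promise to knock down.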

That changes everything.

For example, much of the tech we use to encrypt things and protect your data relies on the fact that to try every possible password simply takes more compute time than anyone has reasonable access to. We have the tech to hack most encryption, but if it’s going to take a million years on the fastest possible computer, then realistically your data is safe. (Unless your password is “password”, or one of the other not terribly startling variants!)
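The arithmetic behind that safety margin is easy to sketch. A toy estimate of worst-case brute-force time – the billion-guesses-per-second rate is an illustrative assumption, not a benchmark:

```python
def brute_force_years(alphabet_size, length, guesses_per_second=1e9):
    """Worst-case time, in years, to try every possible password.

    guesses_per_second is an illustrative figure chosen for this
    sketch, not a measurement of any real attack hardware.
    """
    combinations = alphabet_size ** length
    seconds = combinations / guesses_per_second
    return seconds / (60 * 60 * 24 * 365)

# 95 printable ASCII characters, 12-character password:
print(f"{brute_force_years(95, 12):,.0f} years")
```

Millions of years at a billion guesses per second – which is why, classically, a long random password is safe. It is exactly this "takes longer than anyone can wait" assumption that quantum search threatens.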

Quantum computers might be able to hack your password immediately.

Boom. Move over Facebook, privacy really will be dead then.

One of the other interesting things about quantum computers is that they don’t necessarily give you a single answer. For example, if you are looking for the best route to the airport for the whole taxi fleet in a city, it’s possible to use DWave to give you a set of good answers. Not one best answer, but a range of answers that work. You can see how that might be better – because it’s faster, but also because the one best answer might be out of action suddenly, due to an accident.

If using classical computing to find the best answer will take hours, but a quantum computer can give you 1000 pretty good answers in moments, that’s a step forward.
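As a classical stand-in for that idea, here is a toy “set of good answers” selector: instead of returning only the single optimum, it keeps every route within a tolerance of the best. The route names and times are invented for illustration:

```python
def good_routes(routes, tolerance=0.10):
    """Return every route within `tolerance` of the best (shortest) time.

    `routes` maps route name -> travel time in minutes. Returning a
    *set* of workable answers, rather than one optimum, mirrors the
    kind of output a quantum annealer produces.
    """
    best = min(routes.values())
    cutoff = best * (1 + tolerance)
    return sorted(name for name, t in routes.items() if t <= cutoff)

times = {"freeway": 22.0, "tunnel": 23.5, "backstreets": 26.0, "scenic": 40.0}
print(good_routes(times))  # ['freeway', 'tunnel'] -- two workable answers
```

If the freeway is suddenly blocked by an accident, the taxi fleet already has the tunnel in hand – no need to re-solve from scratch.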
There are some applications that currently run on quantum computers, but for the most part things are so experimental that they’re not producing solutions that work right now. But in exploring these systems, learning their limitations and their advantages, people are finding new problems that quantum computing might provide radical new solutions to, and finding new ways to work with these systems. So although the quantum computers we have right now are relatively small, and not yet faster than classical computers, the more we play with them, the readier we will be to jump on the bigger, more powerful computers as they come online.

One of the things I find really interesting about the quantum computing industry is that it is a deep investment in what has at least in the past been a very speculative area. It flies against the trend I see so much of in science and technology these days where the funding all too often comes with explicit requirements for immediate payoffs.

We heard from John Sarrao from Los Alamos that they are deliberately investing in a kind of “playtime” on their DWave machine in the certain knowledge that it won’t solve immediate problems, but that it is setting them up to get smart people working together and forming communities to think and explore and find a whole range of possible futures. More than all of the quantum computing technology we learnt about, this is the part I found really exciting. Back in the heyday of Xerox PARC this kind of investment in an as-yet-unspecified future was extraordinarily productive, and we need far more of it.

This is how you create innovation – by giving smart people the chance to play, and the opportunity to collaborate. Not by stamping your feet and demanding that people innovate!

That’s just a small slice of the ways my brain was exploded yesterday. Let there be more of it!


The things we put up with

During the week our handbasin drain blocked. I squirmed around underneath it, taking apart the various bits of pipe we have access to, and really struggled. Some bits were so tight they were difficult to undo. Some bits, once they were undone, were incredibly awkward and difficult to get back on straight, so that I could barely do them up once I was finished. And overall it was really difficult to get at these pipes at all.

It turned out that there was a heap of muck in that pipe – probably hair, to start with – that had blocked it up. Why was it even possible for that stuff to get down there? Why hasn’t pipe technology changed much in the last 100 years? Ok, so we have plastic now. That seems to be where the changes end. My great grandfather would probably not see anything particularly different in our plumbing to the pipes he was used to.


It dawned on me then that there is an awful lot of technology in our lives – and make no mistake, pipes are technology, if somewhat primitive – that we simply put up with, despite its issues. The things that are difficult to undo. The things that are difficult to get back on straight. The things that require us to contort ourselves just to use them, whether mentally or physically.

We spend a lot of time adapting to technology.

Why doesn’t technology spend a lot of time adapting to us?

Programming is a case in point. I am incredibly lucky this year to have taught programming to my year 10s with the help of two physics teachers. Yesterday, when I was teaching kids about string slicing – where s[4:6] gives you characters 4 and 5 of the string, not character 6! – one of them looked at me with a frown and said “Why does it stop at 5? Why doesn’t it give you the 6th?”

I was about to launch into the standard explanation of starting at 0 and finishing at length - 1, so you can get a slice that goes to the end by saying s[4:len(s)], when I realised that, in this case, that’s actually completely unnecessary, because if you leave the second parameter blank, Python goes to the end anyway.
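A quick Python illustration of both points:

```python
s = "supercomputer"

# s[4:6] takes indices 4 and 5 -- the end index is excluded.
print(s[4:6])        # 'rc'

# Leaving the second index blank runs to the end of the string,
# so there's no need to spell out s[4:len(s)].
print(s[4:])         # 'rcomputer'
print(s[4:len(s)])   # same thing: 'rcomputer'
```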

But then I realised I was thinking too small. Why do we start at 0 at all? I know the arguments about distance from the start, memory use, blah blah blah etc etc, but they have very little relevance to modern programming. And off-by-one errors are hugely common. The whole starting at 0 thing trips novices up all the time, and it also trips up experienced programmers from time to time. It’s actually a tricky thing to fit into your brain, because it’s contrary to the way we count, as human beings. If I ask you to give me the first item, you’ll call it item 1, not item 0.

I often jokingly say that programming languages start at 0 because Computer Scientists are a little bit odd, and they want to make life hard for you. But now I am wondering whether that is, actually, just a joke.

Why couldn’t we design a programming language that is genuinely easy to use? That corresponds better to our understanding of the way the world works? That contorts itself to us, instead of us contorting ourselves to it? There are so many language features that are common sources of errors. What if we could fix them?

This, to me, is another reason why we need diversity in tech. Because we need people like my Physics teacher friends to look at it with fresh eyes and say “But that’s just dumb!”


As easy as pi?

The new Digital Technologies Curriculum in Australia means that schools around the country at every year level from year 10 right down to teeny tiny preppies (I swear, they get teenier and tinier every year) have to shoehorn technology into their already crowded and frantically busy class time.

This is pretty challenging for teachers with no training in teaching tech. So for those of us with tech skills, and more importantly, those of us with educational tech skills, it’s really important that we are as supportive as we can be of those who haven’t got the skills but are genuinely committed to giving this whole tech thing a jolly good go.

And it worries me – it worries me a lot – that there is a loud and, to me, inexplicable, message out there that hardware is a nice, easy, friendly way in to the tech space.

Grab some Arduinos. Tada! Tech just happens.

Grab some Lego robots. Tada! Tech just happens.

Grab some Raspberry Pi boards. More magic tech materialises out of nowhere.

This is both a triumph and a desperate failure of marketing.

These things, despite their marketing, are not easy to use.

They require significant tech skills to master – or a huge amount of time, and trial and error.

Sure, they are fabulous for the kids who are heavily into this sort of thing and prepared to spend forever bashing their heads against a keyboard and a soldering iron in order to make things happen.

But for the kids who aren’t really into this stuff and need to be persuaded, they can be massively off-putting. For the teachers who have to support the kids who aren’t really into this stuff, they can be even worse.

I am co-supervising an honours student at the moment by the name of Jarred Benham who is looking into the usability of these kits. He has surveyed teachers who use them (If you are a teacher, you can fill out the survey here), and I won’t gazump his results except to say that teachers tend to buy these kits with great optimism, and then find them confrontingly difficult to use in the classroom.

This doesn’t surprise me. The first time I sat down to use the Lego Mindstorms software with an NXT2 robot I was shocked to find how bizarrely difficult it was to use. Lego has a justifiably great reputation for its block kits and its instruction books, but when it comes to Mindstorms it has failed to live up to that reputation in a fairly spectacular way.

I give you, as exhibit A, the action blocks from the Mindstorms software:

[Screenshot: the action blocks from the Lego Mindstorms software]

I haven’t the faintest idea what they mean. This image was taken from a page headed “Learn to Program! It’s easy!” and I suspect the only message a beginner is likely to take away from this is “not for me”.

Let’s look at Arduinos for a moment.

[Screenshot: the Arduino website]

Heavily marketed and widely touted as being easy to use, Arduinos actually require significant tech skills to set up and get working. The website says “The Arduino software is easy-to-use for beginners.”

Allow me to show you the first, and possibly simplest, bit of sample code, direct from the Arduino website:

[Screenshot: sample code from the Arduino website]

Simple, right? Sure, if you’ve programmed in C before. And if you understand the meaning of the words Analog and Serial. Also if you know what a potentiometer is and how to find 5V and ground. And what the heck a Serial Monitor is.

Well… I mean… who doesn’t? Ahem.

But the marketing is so powerful that when I went to the website in order to research this article, and read all the stuff about how easy to use it is, I figured it must have changed since I last tried to program an Arduino – just last year. But no. It’s the same high-entry-level learning cliff. And yet the message “Arduinos are easy to use and a great intro to tech” is extraordinarily pervasive.

Interestingly, when I googled “is Arduino easy to use?” I got a large number of hits that all used the exact same words as the official Arduino website. “Arduino is an open-source electronics prototyping platform based on flexible, easy-to-use hardware and software.” I’m sure their marketing department would be thrilled.

As for Raspberry Pis, they are literally just small, cheap computers. There is nothing special, or particularly easy, about them as an introduction to tech. But again, they have a reputation as easy to use and great for beginners.

Now, as electronics kits go, Arduinos may well be on the easier end of the scale – I don’t know, having not conducted an exhaustive evaluation of all the kits out there – but it’s NOT an easy intro to tech for someone without tech skills. In fact, I’ve yet to see a hardware kit that is.

Even if you ignore the poor usability of the various interfaces, hardware has other drawbacks. There’s maintaining the kits, and dealing with loose connections, dead batteries, and sensors that inexplicably stop working. There are software upgrades that leave older hardware for dead (Looking at YOU, LEGO MINDSTORMS!). And then there’s the sheer cost of buying class sets that are often not robust enough to withstand troupes of 30 eager young people at a time, giving them a hammering all day every day.

This worries me, because those teachers I talked about who want to give this tech thing a red hot go? They’re going to get burned on the deal if they believe that hardware is an easy and fun intro. It’s going to be a lot of pain and trauma getting it going, an even larger amount of pain and trauma keeping it going, and very quickly the kits will become obsolete or too broken to keep using.

When I went around talking to the primary school kids at Young ICT Explorers on the weekend I asked them what it was like learning to build their projects. The ones who used hardware all said “Oh it was really hard.” Is that the message we want to send about tech? That it’s really hard? How many kids (and teachers) are we scaring away with our insistence that these kits are easy to use when they are manifestly not? One of the things that happens when you are told something is easy to use and it’s not is that you assume it’s your fault. That you’re no good at this stuff. That it’s too hard for you. It’s incredibly destructive.

Part of Jarred’s project is to create a website that will help teachers choose the best kits for their purposes based on what other teachers have found. I can’t wait until this website is ready to go public, because I think it’s going to be an incredible resource. But in the meantime I think we should be asking whether using hardware in the classroom actually stacks up in a cost-benefit calculation. Is it worth the pain?

I don’t think it is. But if teachers choose it, at least we can help them choose it with their eyes open.



Posted in digital technologies curriculum | Tagged , , , | Leave a comment