Recently, I had the privilege of reading the first chapter of my latest novel, Fluence, at Novel London – a literary event with an intimate audience that’s held once a month in different venues around central London. Take a look at the photo above and the video below to get a sense of the location and the lofty position the authors occupied.
It was a significant evening for me in many ways – partly because it’s the first recording of me reading from Fluence, but also because it was held in the St. Pancras Clock Tower, which used to be a dilapidated building and was once at the top of my list of places to squat.
Whenever I read in public, it strikes me that although I love reading to an audience, I enjoy signing books and chatting afterwards just as much – and this evening was no exception. All in all it was a great event: the readings, the tower, the wine and the audience added up to a friendly and enthusiastic evening. What more could you ask for?
I hope the first chapter will give you enough of a taste to make you want to read the whole book!
Through the windscreen she saw a group of children crossing the road slowly, sliding around on the ice.
The driverless car wasn’t braking.
She’d forgotten to ask the hire company if this model was programmed to prioritise pedestrians or passengers. It would make a choice – her daughter or the kids playing in the road – but she didn’t know which one.
She could override it by taking control of the steering. But her driving ability was far inferior to the car’s algorithm. She glanced at the rock face on one side and the cliff edge on the other.
I came across these two stories last week – there’s an algorithm that can detect deceit in your social media feed, and Twitter has been telling people they don’t exist.
This led me to ponder what it would be like to be in charge of a social media company with a conscience.
Imagine you’re uncomfortable with providing a platform from which people tell lies that are stored for future generations as the accurate record of our social history.
If your algorithms can detect deceit, and detect it more effectively than human beings – that’s the claim – then would you consider it your moral duty to find the lies and delete them all? Of course, you’d have to trust the algorithms, and their creators, not to deceive you.
Would you delete everything that appeared to be a lie, no matter how big or small?
I wonder if Twitter is temporarily suspending accounts while it cleanses them.
Have you checked your social media history recently?