
Cartoon: Dexter Wants A Break


Body of Evidence: ‘The Burning Question’

SERIES: CRIME FILE INVESTIGATIONS

. Intro & Preamble Note: ‘Body of Evidence’ includes a cast and personnel list/glossary of terms

A series of crime scenes that will require the reader to apply their forensic skills in solving the mysteries.

Burton walked into the restaurant’s kitchen; its stainless steel and tile surfaces were covered in soggy soot and burnt debris. The sprinklers had been shut off over an hour ago, but the overhead fixtures still dripped steadily. He brought head chef Nathan Olivo in with him, careful to keep the distraught man away from any evidence.

“I hope you like your steak well done,” said Mike Trellis, Burton’s CSI technician. He specialised in arson investigation and bad jokes. Burton laughed; the chef did not.

Trellis was using a fuel sniffer, which looked like a small cane attached to a lunch box, to check areas of the kitchen for traces of accelerant. Petrol and paraffin were the most common, but he had seen arsonists use everything from Silly String to hair spray to start a fire.

“What happened here?” Burton asked.

“It was about half an hour after we closed. We were all in the bar toasting the end of the night when the kitchen just blew up. I started the toast tradition a few weeks ago when we got a mediocre review in the local restaurant guide. The toast is supposed to build morale and create team atmosphere – everyone was pretty down after that review. But the bad food wasn’t our fault, it was the stove.”

“The stove?” Burton said. “Was there a problem with it?”

“Problem? It was a piece of garbage,” Olivo said. “Always burning entrées, scalding sauces and stinking of gas; the pilot light for one of the burners kept going out. I asked the manufacturers to replace it several times, but they refused, saying it was fine.”

Trellis walked over to the blackened stove, the sniffer leading the way.

“Thank you, Mr Olivo,” Burton said, leading him towards the door. “Please step outside with the other employees and we’ll finish up in here.”

Burton shone his flashlight around the kitchen. “The room looks like there was a sudden explosion rather than a slow burn,” he said. “And soot is covering just about every surface in here – walls, counters and especially the ceiling and ceiling fans – so whatever happened, it sent residue everywhere. But what burned in order to make the soot? Soot results from imperfect burning, and gas burns cleanly, with no residue. I can’t believe the kitchen had enough dust to cause this mess.” Burton looked again at the ceiling and the black film covering it. “Wait a minute. Were the ceiling fans on when the kitchen blew?”

Trellis checked his notes. “The fan switch was in the on position, but the explosion knocked out the electricity, so they weren’t spinning for long. The big exhaust ducts up there were off for the night.”

“Let’s try to get a fingerprint off that fan switch,” Burton said. He climbed onto the stainless-steel island in the middle of the kitchen and took a closer look at one of the ceiling fans. It was caked with black soot, as was the ceiling above it. He reached above the fan and ran his finger along the top side of one of the blades. It came back with a white substance on it. Burton smelled it once, then touched it to his tongue.

“Mmm. Tastes like arson,” he said.

How did he know?

– Author’s note: No solution to this case will be made public.


New enforceable code for web giants

INFORMATION COMMISSIONER

FACEBOOK, Google and other social media platforms will be forced to introduce strict age checks on their websites or assume all their users are children.

Web firms that hoover up people’s personal information will have to guarantee they know the age of their users before allowing them to set up an account.

Companies that refuse will face fines of up to 4 per cent of their global turnover – £1.67 billion in the case of Facebook.

The age checks are part of a tough new code being drawn up by the Information Commissioner’s Office (ICO), which is backed by existing laws and will come into force as early as the autumn.

. See also Internet safety: The era of tech self-regulation is ending

Experts claim it will have a “transformative” effect on social media sites, which have been accused of exposing young people to dangerous and illicit material, bullying and predators. It includes rules to help protect children from paedophiles online.

The code also aims to stop web firms bombarding children with harmful content, a problem highlighted by the case of Molly Russell, 14, who killed herself after Instagram allowed her to view self-harm images. Under the new code:

. Tech firms will be banned from building up a “profile” of children based on their search history, and then using it to send them suggestions for material such as pornography, hate speech and self-harm.

. Children’s privacy settings must automatically be set to the highest level.

. Geolocation services must be switched off by default, making it harder for trolls and paedophiles to target children based on their whereabouts.

. Tech firms will not be allowed to include features on children’s accounts designed to fuel addictive behaviour, including online videos that automatically start one after the other, notifications that arrive through the night, and prompts nudging children to lower their privacy settings.

Once the new rules are implemented, children should be asked to prove their age by uploading their passport or birth certificate to an independent verification firm. This would then give them a digital “fingerprint” which they could use to demonstrate their age on other websites.

Alternatively, the tech firms could ask children to get their parents’ consent, and have the parents prove their identity with a credit card.

If the web giants cannot guarantee the age of their users, they will have to assume they are all children – and dramatically limit the amount of information they collect on them, as set out in the code.

At present, a third of British children aged 11 and nearly half of those aged 12 have an account on Facebook, Twitter or another social network, Ofcom figures show.

Many youngsters are exposed to material or conversations they are too young to cope with as a result.

The Deputy Commissioner at the ICO said: “We are going to be making it quite clear that there is a reasonable expectation that companies stick to their own published terms and policies, including what they say about age restrictions.”

Baroness Beeban Kidron, who tabled the House of Lords amendment that ensured the new code would be drawn up and put into law, said: “I expect the code to say: ‘You may not, as a company, help children find things that are detrimental to their health and well-being.’ That is transformative. This is so radical because it goes into the engine room, into the mechanics of how businesses work and says you cannot exploit children.”

The rules will come into force by the end of the year, and will be policed by the ICO, which has the powers to hand out huge fines.

It will also use its powers to crack down on any web firm that does not have controls in place to enforce its own terms and conditions. Companies that say they ban pornography and hate speech online will have to show the watchdog they have reporting mechanisms in place, and that they quickly remove problem material.

Firms that demand children are aged 13 or above – as most web giants do – will also have to demonstrate that they strictly enforce this policy.

At the moment, web giants such as Facebook simply ask children to confirm their age by entering their date of birth, without demanding proof.

 

FOR far too long, social media giants have arrogantly refused to take responsibility for the filth swilling across their sites.

Many of these firms, cloistered in Silicon Valley ivory towers, are owned by tax-avoiding billionaires who are indifferent to the trauma inflicted on children using websites such as Facebook and Instagram.

At the click of a mouse, young children are at risk of exposure to paedophiles, self-harm images, online pornography and extremist propaganda.

Finally, however, these behemoths are being brought to heel by the Information Commissioner’s Office (ICO). They must ensure strict age checks and stop bombarding children with damaging content – or face multi-million-pound fines.

Such enforced regulation is very welcome and well overdue.
