Sora, Dark Data & Vision Pro Returns

5 in 5 - Brave & Heart HeartBeat #191 ❤️

This week we’re talking about the AI video generation to end all AI video generation, why AI won’t help solve your dark data issues, and why so many people are returning their Apple Vision Pro headsets.

Plus, can a parent’s WhatsApp group help get smartphones away from kids, and is “helpful” recruitment technology blocking the best candidates?

Let’s get into it.

Were you forwarded this? Not a subscriber? 👉 Sign up here


#1 - AI Video Generation Is Finally Here

Because OpenAI says it is.

Although text-to-video generation with AI technically already existed, the gap between the AI videos created by Meta and Google and those created by OpenAI is so vast that we can basically agree it’s just been invented – because whatever those were, they weren’t this.

The videos generated by OpenAI’s video generation software Sora, Japanese for sky, can easily pass for real filmed footage at first glance, and the levels of detail are pretty amazing.

At least, the videos shared by Sam Altman, which he says Sora created, are amazing – the average person can’t test it out for themselves just yet.

OpenAI announced that their new text-to-video diffusion model is now available to red-teamers – the experts who test new technology for harms and risks, i.e. to see if Sora will be able to bypass certain rules. The same rules used by DALL-E to reject “harmful or inappropriate” text prompts will be in place for Sora.

The announcement also says that Sora will be made available to a “select group” of visual artists, designers and filmmakers in order to “gain feedback on how to advance the model to be most helpful for creative professionals.”

Will this newest OpenAI innovation be a tool that allows creativity and creation to blossom, or a danger to creators and everyone else?

I Guess We’ll Find Out



#2 - Apple Fans Return Their Vision Pros

Yes, that’s right: the two-week return period has come to an end, and many of the Apple Vision Pro’s early adopters – those who bought the headset right out of the gate – have chosen to return them.

Reasons given include the headset’s weight and how uncomfortable it is to wear, along with eye strain, motion sickness and headaches, plus the feeling that what it delivers simply isn’t worth the $3,500 price tag.

One consumer described the headset as being as “magical to use” as they had hoped it would be, but said they couldn’t deal with the discomfort it caused, in their case headaches and eye strain.

They did say, however, that they would be “back for the next one” – assuming that Apple will level up the product in the years to come, as they have been known to do.

Another fan tweeted (we’re not going to say Xed, we’re just not going to say it, okay) that while they thought the Vision Pro was the most “mind-blowing piece of tech” they’d ever tried, they “couldn’t wait” to return it. Again, headaches.

Others returned the headset because there simply aren’t enough uses to justify the high price tag. Graphic designers and coders, for example, didn’t feel they could do their work on the Vision Pro, and called issues like fuzziness around windows and fiddly file management “productivity killers”.

Spatial computing, then, doesn’t seem to be the game-changer Apple had originally sold it as. Does the first wave of buyers-turned-returners speak for everyone? Will there be a second, better version?

We can only wait and see, but it’s not looking good for the Vision Pro.

Damn Pesky Eyeballs


#3 - Dark Data And AI

A winning team? More like a losing team.

There’s no denying the need to record certain data, and technological advancements have made it easier than ever, but we’re now confronted by a new, very modern, very real problem – Dark Data.

Dark Data is data that is used once, then forgotten… but kept in storage. The problem with that? Energy use and, most importantly, carbon emissions.

Not to mention the cost for the businesses storing the data. Amazon Web Services has reported cases where 80% of on-premises primary business data hasn’t been accessed in years, yet it’s still consuming power to stay available. In some cases, businesses will also be paying through the nose to offset the carbon that power consumption creates – talk about inception.

AI is the answer to everything, right? So surely it can help here? Wrong.

If you’re not careful, AI can actually make the problem a lot worse. In a new article, Professors Hodgkinson and Jackson at Loughborough University – who we worked with to create a Digital Decarbonisation model for businesses looking to cut their carbon emissions – have emphasised the importance of thinking about data management practices when looking to use AI.

As huge amounts of data are required to train AI models, the storage of that data needs to be considered from the outset to keep the environmental impact of unchecked AI data in check.

They urge businesses to think about adopting data management strategies that prioritise “data minimization, efficient storage and responsible data disposal” to reap the benefits of AI without increasing their ecological footprint.

As always, we couldn’t agree more.

Wonder How Much Carbon OpenAI Is Producing…



#4 - A Smartphone-Free Childhood

A memory for some, a seemingly impossible dream right now, and the name of a WhatsApp group started by two parents in the UK which has exploded in popularity this week.

The group was set up by two female friends with primary-school-age children in response to their fears about kids using smartphones, and worries about how it’s become the “norm” to give children a smartphone around the time they start secondary school.

The idea was simple: nobody wants their kid to be the only one without a smartphone, but if they banded together with other parents who felt the same, maybe they could push the dreaded moment back a little.

If you’re the minority without a smartphone, that’s difficult, but if enough parents join the movement then that peer pressure will no longer be the huge issue that it once was.

The group’s founders were hoping their children wouldn’t get phones until 14, with social media access held off until 16.

Unexpectedly for the founders, who thought they must be in a minority, the group reached the 1,000-person limit within 24 hours. The pair encouraged people to create local groups, and within half an hour 30 such groups had sprung up across the country, reaching 4,000 people and expanding rapidly.

Many commentators are currently alerting us to the dangers smartphones and social media access pose to children, and parents seem to be getting the message.

This might actually work…

Safety In Numbers


#5 - AI vs. Candidates

AI-powered recruitment tech was supposed to revolutionise the industry and remove the potential for human bias, and it has certainly saved some firms money, but it may also be keeping some of the best candidates out of roles they’re qualified for.

Experts say that on the bias question, there hasn’t actually been that much proof that AI tools are less biased than human interviewers and recruitment professionals, and that if they are in fact problematic, they could be doing a lot more harm than good.

And potentially on a much larger scale, considering that over 40% of businesses are reportedly using AI in their hiring processes.

One example given: a single biased hiring manager can do a lot of harm to a lot of people over a year, but that’s nothing compared to an algorithm potentially being used by hundreds of companies, affecting thousands of candidates.

Take one high-profile case from the UK. After being furloughed during the pandemic, make-up artist Anthea Mairoudhiou was asked to re-apply for her role in 2020, this time being evaluated on past performance and via the AI screening programme HireVue.

While she ranked well on skills, she was out of a job after the AI tool scored her body language negatively. HireVue went on to remove their facial analysis function in 2021, but too late for Anthea.

As one commentator notes, the fact that the software has saved companies a lot of money doesn’t give them much incentive to change how they use it right now, but maybe they’ll take another look in the future, when they find their business is missing out on the best talent…

Don’t Look Shifty During Your Body Language Evaluation


Brave & Heart over and out.

Bonus 

AI Apartments

Remember House Doctor?

Another job replaced by AI, because why bother decorating a flat you’re trying to sell when AI can do that for you?

Home hunters have spotted some fishy-looking apartments on estate agents’ websites, where fake AI-generated furniture has been added to dress up the space.

Obviously it’s completely fake, and probably doesn't take measurements into account or reflect what the place would actually look like decorated.

Ingenious or devious? A bit of both probably.

Apartments In The Uncanny Valley


To find out more about how you can retain your top talent, or how we can help you with digital solutions to your business and marketing challenges, check out our case studies.

