Fyma was founded by Taavi Tammiste and me in 2019. My passion has always been film and TV, and my career even took me to Dubai, where I worked on Hollywood franchises such as The Fast and the Furious and Star Wars. I’ve also held a number of business leadership roles, so becoming co-founder and CEO of a tech firm that specialises in transforming cameras was a natural step, combining my passions in a new way.
Taavi’s background is in telecommunications, cyber security and AI. He’s already built four AI tech teams, and Fyma is his second startup. I like to think our skills and personalities are a great match. And, we’re supported by a team of great investors and a growing number of colleagues who believe in us and our product.
We’ve created an AI computer vision platform that quickly and easily plugs into cameras – CCTV, IP, webcam, NVR – and turns a standard camera feed into previously untapped insights about people, their behaviours and other objects. That means real estate managers, urban planners and data analysts get new, valuable insights for redesigning and planning spaces for better customer and citizen experiences, safety, comfort and revenue generation.
How did you come up with the idea for the company?
I’d say we came up with the idea by encountering some common problems that really need fixing. We’re problem-solvers for our customers, and that keeps us motivated and passionate.
First, there are close to a billion cameras globally, most doing little more than passive observation or basic people counting, much like sensors. Camera set-up, maintenance, upgrades and connectivity come at a cost, too. Taavi had experience doing AI consulting around cameras and sensors for traffic junctions, and realised how slow and costly it was for the customer.
With the Fyma platform, we’ve found a way to get more ROI for customers’ investment in cameras and related infrastructure. Imagine a large real estate management team or local government road and traffic team being able to transform their cameras into AI-generated data insights within minutes.
The other problem is consumer and citizen privacy. A lot of negative media coverage and concern arises when we talk about AI and facial recognition, and rightly so. It’s too easy for facial recognition to be misused in biased ways, including racial and ethnic profiling. In most environments it’s also hard to do well, and there’s no commonly agreed taxonomy of the human race anyway.
So, we’re pioneering a new, more ethical type of AI computer vision, one that does not detect, understand or process human faces, ever. We’re giving customers deep, valuable data insights without infringing on personal privacy and profiling concerns, which we’re completely opposed to. It also means our AI is leading the way in getting more insights from other human behaviours, movements, and characteristics that don’t infringe privacy.
How did Fyma develop this newer, more ethical AI?
It starts at the training stage. AI computer vision algorithms need huge volumes of data – in this context, images – to learn about humans, objects, modes of transport and behaviours. So, at the training stage we blur out human faces, every time, so that the algorithms never see or learn what a human face is. Even our data science team doesn’t see the human faces in training images.
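To illustrate the idea only (this is not Fyma’s actual pipeline), anonymising a training image before any model sees it can be sketched as below. The face bounding boxes are assumed to come from a separate, external face detector; the blur used here is the crudest possible one, replacing each face region with its mean colour, which leaves nothing for a model to learn a face from:

```python
import numpy as np

def anonymise(image: np.ndarray, face_boxes) -> np.ndarray:
    """Return a copy of `image` (H x W x 3) with every (x, y, w, h)
    face box flattened to its per-channel mean colour, so a model
    trained on the output can never learn what a face looks like.
    `face_boxes` is a hypothetical input from an external detector."""
    out = image.astype(np.float64).copy()
    for x, y, w, h in face_boxes:
        region = out[y:y + h, x:x + w]
        # per-channel mean over the face region
        out[y:y + h, x:x + w] = region.mean(axis=(0, 1))
    return out.astype(image.dtype)
```

In practice a production pipeline would use a heavy Gaussian blur or pixelation rather than a flat fill, but the principle is the same: the anonymisation happens before training, not after.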
So, when it comes to the real-life deployment stage, the platform receives the camera feed but never detects or classifies human faces. No camera feed data is stored on the platform or by Fyma: it’s all deleted within seconds of passing through the platform, with just the metadata saved. We also maintain high levels of GDPR compliance across our tech stack, processes and business partnerships.
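The “metadata only, frames discarded” flow described above can be sketched as follows. The detector interface and the `Detection` fields here are hypothetical stand-ins, not Fyma’s API; the point is simply that only anonymous records outlive the frame:

```python
import time
from dataclasses import dataclass

@dataclass
class Detection:
    label: str    # e.g. "person", "bicycle" (never a face or identity)
    x: float      # position within the frame
    y: float
    ts: float     # timestamp of the observation

def process_frame(frame, detector, store):
    """Extract anonymous metadata from one frame, then let the
    pixels go; only the Detection records are ever persisted."""
    now = time.time()
    for label, x, y in detector(frame):
        store.append(Detection(label, x, y, now))
    # `frame` goes out of scope here: no pixel data is retained
```

A caller would pass each incoming frame through `process_frame` and keep only `store`, so the raw video never touches disk.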
How has the company evolved during the pandemic?
The demand for a more data-driven approach to managing commercial spaces, crowds, public transport and road traffic has grown significantly. We’ve worked with major shopping centres across the Baltics and Nordics, local governments in Latvia and Estonia, and British mixed-use leisure, retail and business parks, all looking to better understand how consumers and citizens are using roads, shopping centres, venues and public spaces, and how usage has changed during and after Covid lockdowns and social distancing measures. The insights Fyma is delivering are helping our customers to create more accessible, safer and smarter experiences, and plan for future needs.
What can we hope to see from Fyma in the future?
We’re going to stay focused on our mission: solving the problem of getting better ROI from cameras, doing AI computer vision in an ethical way, and giving customers the insights they need, both to create the best experiences for their own customers and citizens and to plan for the future of their businesses and organisations.
And, as a tech company, we’re always innovating, looking at new ways of working and new things to build that meet the needs of current and future customers. I don’t want to give too much away, but the future might include a new product offering for individual consumers – think cars and homes, for example. There are also businesses with video footage from sources other than cameras, so we will shortly be launching a post-processing product that lets them reap the benefits of Fyma’s AI computer vision insights even if they don’t have any or many cameras to draw on. We’ll be sure to keep Techround and its readers updated.