AUTOMA is the new open-access marine monitoring platform developed by 20tab together with EdgeLab, Superfici, and Pelagosphera, and funded by the European Union as part of the National Biodiversity Future Center (NBFC). The project combines underwater robotics, artificial intelligence, and scientific research to recognize and monitor invasive alien species, assess their impact on ecosystems, and support timely decisions by institutions and researchers.
Client
National Biodiversity Future Center (NBFC)
Industry
Blue Economy - Scientific Research
Year
2024-2025

Before AUTOMA launched, the monitoring of invasive alien species relied on fragmented processes.
The main critical issues were:
Heterogeneous data collection
Very slow, manual analysis
Difficulty consolidating information from AUVs, divers, and researchers
The stakeholders involved - marine biologists, research laboratories, technology partners, and marine protected areas - had clear objectives: to make monitoring more scientifically robust, reduce manual workload, engage citizen science communities, and build a platform that could grow over time.
The challenge wasn't exclusively technical. It required a strategic vision capable of combining scientific needs, operational complexity, institutional objectives, and a rigorous approach to data quality.
Offering
Data & AI Solutions
UX/UI Design
Product Development
The project began with a discovery phase to fully understand the scientific context and the needs of the various stakeholders. Through interviews, technical evaluations of datasets, and initial artificial intelligence tests, 20tab clarified the critical points of existing processes and defined a shared user story map, the basis upon which the roadmap was built.
The design phase translated these insights into concrete flows, prototypes, and initial versions of the dashboard, developed and validated iteratively with marine biologists and technical partners. In parallel, 20tab also oversaw the project's visual positioning, defining naming, graphic identity, and presentation materials for dissemination.
In the first quarters of 2025, development of the core system began: the infrastructure, authentication mechanisms, a first complete image upload and consultation cycle, and a fully navigable dashboard. At this stage, a first automatic processing pipeline based on computer vision was also activated, designed to accelerate classification and reduce manual effort.
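The case study does not describe the pipeline's internals, so the following is only a minimal sketch of what an automatic image-classification step might look like, assuming a Python stack with PyTorch and torchvision. The `classify_image` helper, the pretrained backbone, and the review threshold are all assumptions made for the example, not details of AUTOMA's actual implementation.

```python
# Illustrative sketch only: an automatic classification step for uploaded survey
# images. Model, labels, and confidence threshold are hypothetical.
from pathlib import Path

import torch
from PIL import Image
from torchvision import models, transforms

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])


def classify_image(path: Path, model: torch.nn.Module) -> dict:
    """Return a provisional class index and confidence for one uploaded image."""
    image = Image.open(path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # shape: (1, 3, 224, 224)
    with torch.no_grad():
        probs = torch.softmax(model(batch), dim=1).squeeze(0)
    confidence, index = torch.max(probs, dim=0)
    return {
        "file": path.name,
        "class_index": int(index),                       # mapped to a taxon by a fine-tuned head
        "confidence": float(confidence),
        "needs_expert_review": float(confidence) < 0.8,  # hypothetical threshold
    }


if __name__ == "__main__":
    # A generic pretrained backbone stands in for a model fine-tuned on marine taxa.
    backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
    backbone.eval()
    print(classify_image(Path("sample_dive_frame.jpg"), backbone))
```

In a setup like this, low-confidence predictions would be routed to expert review rather than accepted automatically, which is consistent with the supervised approach described later in this case study.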
In the final phase of the project, AUTOMA accelerated on three fronts.
Tools and Practices
User story mapping
Customer journey map
User flow
Lean Canvas
Jobs to Be Done
Wireframe
Iterative prototyping
Continuous feedback loops
One of the core elements of AUTOMA is work on machine learning applied to species recognition. The goal was not just to automate, but to build a scientifically reliable system: well-structured datasets, consistent metadata, and clear validation processes so that the AI could be trained and improved over time.
In recent months, the project has strengthened its data quality and traceability activities: naming and cataloging standards, georeferencing, and a dual level of verification (technical and scientific) to ensure taxonomic accuracy and photo/video data integrity. These protocols have also been implemented within the platform, improving metadata management and making validated data immediately usable for model training and development.
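The platform's actual data model is not published, so the sketch below is purely illustrative of how a catalogued observation with georeferencing and a two-level (technical and scientific) verification status could be represented, and how only fully validated records would be released for model training. All names (`ObservationRecord`, `training_ready`, the example identifiers) are assumptions for the example.

```python
# Illustrative sketch: a catalogued observation with georeferencing and dual
# verification. Field names are assumptions, not AUTOMA's actual schema.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class ObservationRecord:
    media_id: str                  # follows a naming/cataloging convention, e.g. "AUV01_20250312_0042"
    taxon: str                     # provisional or confirmed scientific name
    latitude: float                # georeference of the sighting
    longitude: float
    depth_m: float
    captured_at: datetime
    source: str                    # "AUV", "diver", "citizen_science", ...
    technically_verified: bool = False    # file integrity, metadata completeness
    scientifically_verified: bool = False  # taxonomic identification confirmed by an expert

    @property
    def training_ready(self) -> bool:
        """Only records that passed both verification levels feed model training."""
        return self.technically_verified and self.scientifically_verified


def select_training_set(records: list[ObservationRecord]) -> list[ObservationRecord]:
    """Filter the catalogue down to fully validated observations."""
    return [r for r in records if r.training_ready]


if __name__ == "__main__":
    record = ObservationRecord(
        media_id="AUV01_20250312_0042",
        taxon="Caulerpa cylindracea",
        latitude=38.11, longitude=13.36, depth_m=12.5,
        captured_at=datetime(2025, 3, 12, 10, 41, tzinfo=timezone.utc),
        source="AUV",
        technically_verified=True,
        scientifically_verified=True,
    )
    print(len(select_training_set([record])), "record(s) ready for training")
```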
The result is a supervised and transparent approach, where AI supports the analysis and experts validate the outputs, creating a continuous cycle of improvement.
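To make the shape of that cycle concrete, here is one possible sketch, again under assumed names and structures, of how an expert decision could be stored alongside the model's proposal so that corrected labels become new training examples.

```python
# Illustrative sketch of a supervised review loop: the model proposes a label,
# an expert confirms or corrects it, and the validated answer is kept for the
# next training round. Names, structures, and species are hypothetical examples.
from dataclasses import dataclass


@dataclass
class ReviewedPrediction:
    media_id: str
    model_label: str        # what the AI proposed
    expert_label: str       # what the expert confirmed or corrected
    model_was_correct: bool


def review(media_id: str, model_label: str, expert_label: str) -> ReviewedPrediction:
    """Store the expert decision alongside the model output for later retraining."""
    return ReviewedPrediction(
        media_id=media_id,
        model_label=model_label,
        expert_label=expert_label,
        model_was_correct=(model_label == expert_label),
    )


if __name__ == "__main__":
    validated = [
        review("AUV01_20250312_0042", "Caulerpa cylindracea", "Caulerpa cylindracea"),
        review("DIVER03_20250401_0007", "Percnon gibbesi", "Callinectes sapidus"),
    ]
    # Corrected labels become new training examples; agreement on reviewed items
    # gives a rough signal of when retraining is worthwhile.
    accuracy = sum(r.model_was_correct for r in validated) / len(validated)
    print(f"Expert-reviewed items: {len(validated)}, model agreement: {accuracy:.0%}")
```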
AUTOMA is now a robust, expanding system based on a framework that enables rapid, iterative, and scientifically rigorous development.
The main benefits:
Reduced manual workload in classification and analysis
Faster, more consistent species recognition supported by computer vision
Validated, traceable data immediately usable for model training
The evolution doesn't stop there: AUTOMA will enter new development cycles, including predictive models, advanced data analytics, and integrations with underwater robotics and citizen science communities.
Lessons learned
This project highlights how well-planned technical work can become a true driver of transformation, especially when accompanied by a strategic vision and continuous development cycles.
Experience has shown that constant dialogue with stakeholders allows for more informed decisions and keeps the product aligned with real needs, while transparent and supervised use of AI guarantees reliable and scientifically sound results.
We've also found that building a robust architecture from the start accelerates system evolution and makes costs more sustainable. At the same time, well-organized processes and communication keep complexity under control and highly diverse teams aligned.
Want to learn more about this project or build a similar system for your research domain?