SmartSprout is a validation project within S3Food that applies smart vision technology to the trimming of Brussels sprouts, using sensor-integrated visual inspection and sorting. In this way SmartSprout intends to optimize two key priorities for food processors: resource efficiency and quality control. The project aims to reduce waste and manual labour and to generate reliable data that secures consistent food quality and traceability.
Using a solution based on a ‘sproutcam’, a sorting carousel and a trimming machine, the objective is to minimize food waste, ensure uniform quality and increase processing capacity with lower labour requirements. The system will also provide detailed data on food safety and process performance.
TechNature – a producer of high-tech solutions for the agri-food market and the lead SME – anticipates that the system will benefit processing companies, retailers and consumers. TechNature is collaborating with Bac Investments, which processes agricultural products at facilities in Poland. Both partners are based in the Netherlands.
Over the last few years the production of the dessert company Dulcegrado has been growing steadily. The company will now analyze its current production system and evaluate the effects of possible modifications to the production process, such as the deployment of an HTST pasteurizer.
This is a cross-sectoral collaboration initiative between companies of Clusaga and ASINCAR, including Dulcegrado, AOTECH and the Galician SME Glaucor.
The project, which has been awarded an S3Food application voucher of up to 180,000 euros, consists of developing a system that automates data collection across dairy production, energy supply and quality control; the collected data will be analyzed on an Internet of Things (IoT) cloud platform.
Through this system, and with the introduction of Near Infrared (NIR) technology, the production can be characterized in terms of energy costs and performance, in order to optimize the use of ingredients and reduce their consumption.
In the food production industry it is key to detect production issues and defective products quickly. The longer detection takes, the more production time is lost and the higher the cost of production.
Within the S3Food project, NDUX – a company based in Haaltert, Belgium – will create a framework around a data-driven process monitoring tool, giving food production companies more control over their own production processes.
With this framework a company gains insight into how production monitoring can be done and which parameters are relevant, so it can take the steps needed to move from reactive production monitoring towards data-driven production monitoring.
By applying the framework to their own production processes, each production company can address challenges in resource efficiency, quality control, food safety and traceability.
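To illustrate the shift from reactive towards data-driven monitoring in general terms: instead of alarming on a fixed threshold, a tool can learn statistical control limits from a rolling window of recent readings and flag excursions. The sketch below is a minimal, hypothetical example; the sensor values, window size and threshold are invented and are not part of the NDUX framework.

```python
# Minimal sketch of data-driven process monitoring: flag readings that
# drift outside statistical control limits learned from recent history.
# All names and numbers are illustrative, not the NDUX tool itself.

from statistics import mean, stdev

def control_limits(history, k=3.0):
    """Return (lower, upper) limits at k sample standard deviations."""
    m, s = mean(history), stdev(history)
    return m - k * s, m + k * s

def monitor(readings, window=20, k=3.0):
    """Yield (index, value) for readings outside the learned limits."""
    for i in range(window, len(readings)):
        lo, hi = control_limits(readings[i - window:i], k)
        if not (lo <= readings[i] <= hi):
            yield i, readings[i]

# Example: a stable temperature signal with one sudden excursion.
signal = [72.0 + 0.1 * (i % 5) for i in range(40)]
signal[30] = 75.0  # simulated process upset
alarms = list(monitor(signal))  # flags the reading at index 30
```

The point of the example is that the limits adapt to the process itself, so no fixed threshold has to be guessed up front.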
Doroti Pack, a Hungarian packaging machine specialist, wants to significantly improve the working conditions for operators of automated packing lines for chicken wings. ImProWings is the name of this S3Food-funded project. Doroti Pack aims to achieve several goals at once by developing a new packing line automation system: in addition to improving working conditions for the operators, it wants to reduce energy consumption by 30% and cut packing costs by 20%.
Working with applied research partner Bay Zoltán Nonprofit Ltd, Doroti will draw on computer vision software and AI to determine the position and orientation of chicken wings on the line. This data will then be applied to coordinate a robotic gripper arm, which will place the wings in trays. The smart solution will enable fast, hygienic and high-quality packing with minimal health risk for operators. Industrial trials will ensure the solution’s stability.
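Determining the position and orientation of an object on a line from camera data can be sketched with classic image moments: the centroid of a binary segmentation mask gives the position, and the second central moments give the principal axis. The code below is an illustrative sketch only, not Doroti Pack's actual software; the mask and all values are made up.

```python
# Illustrative sketch: estimate the position and orientation of an
# object on a packing line from a binary camera mask using image
# moments. Not the ImProWings implementation; values are invented.

import math

def position_and_orientation(mask):
    """mask: 2D list of 0/1 pixels. Returns (cx, cy, angle in degrees)."""
    pts = [(x, y) for y, row in enumerate(mask)
                  for x, v in enumerate(row) if v]
    n = len(pts)
    cx = sum(x for x, _ in pts) / n  # centroid = position
    cy = sum(y for _, y in pts) / n
    # Central second moments give the blob's principal axis.
    mu20 = sum((x - cx) ** 2 for x, _ in pts) / n
    mu02 = sum((y - cy) ** 2 for _, y in pts) / n
    mu11 = sum((x - cx) * (y - cy) for x, y in pts) / n
    angle = 0.5 * math.atan2(2 * mu11, mu20 - mu02)
    return cx, cy, math.degrees(angle)

# A horizontal 1x5 bar of "on" pixels centred at (4, 2).
mask = [[0] * 9 for _ in range(5)]
for x in range(2, 7):
    mask[2][x] = 1
cx, cy, angle = position_and_orientation(mask)
# cx = 4.0, cy = 2.0, angle = 0.0 (aligned with the x-axis)
```

In a real system the centroid and angle would then be transformed into the robot's coordinate frame to drive the gripper.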
How can AI-based digitalization of yogurt production increase efficiency and reduce costs? IQOFTHINGS along with MANDREKSA SA and 3CASTALIA TECHNOLOGIES SMPC have set out to find out in their S3FOOD project, YOGUSENSE.
Process Analytical Technology (PAT) is a system for designing, analyzing and controlling processes through measurements of critical process parameters and quality attributes of raw and processed materials. The aim is to improve product quality, increase efficiency and reduce costs. PAT has been employed effectively in high-margin industries.
YOGUSENSE aims to capitalize on the capabilities of soft sensing to address quality, efficiency and costs. The project also aims to mainstream the application of PAT among small and medium-sized food producers by demonstrating, in a fermentation-based process, the commercial feasibility as well as the economic and environmental benefits of a PAT platform. Such a platform fuses data from sensors, analyzers and other sources to create optimization models for in-process, real-time quality control.
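The soft-sensing idea behind such a platform can be shown with a toy model: a hard-to-measure quality attribute is inferred in real time from an easily measured process variable, using a model fitted to historical lab data. The variables, data and linear model below are purely illustrative assumptions, not the YOGUSENSE PAT platform.

```python
# Hedged sketch of a "soft sensor": infer a hard-to-measure quality
# attribute (here a hypothetical viscosity value) from an easily
# measured variable via a least-squares fit. Data are invented.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Historical lab measurements: fermentation time (h) vs. viscosity.
hours = [2.0, 3.0, 4.0, 5.0, 6.0]
viscosity = [1.1, 1.6, 2.1, 2.6, 3.1]  # perfectly linear for clarity

a, b = fit_line(hours, viscosity)

def soft_sensor(t_hours):
    """Real-time viscosity estimate without an inline viscometer."""
    return a * t_hours + b

estimate = soft_sensor(4.5)  # → 2.35 with this illustrative data
```

A production-grade soft sensor would fuse several process variables and be revalidated against lab samples, but the principle is the same.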
– easics booth at VISION 2021 in Stuttgart
– live nearbAI demo at Bits&Chips Event 2021 in Eindhoven
– nearbAI prototype board supporting FPGA SoMs and ASIC test-chips
– nearbAI talk at VISION 2021 in Stuttgart
embedded AI close to your sensors
nearbAI™ is easics’ trademarked product for embedded neural network inference using digital hardware, applied close to your sensors. Its prime application is pattern matching in the supervised learning paradigm. nearbAI consists of two parts: a configurable semiconductor IP core that gets instantiated on your custom ASIC or on an FPGA, and software tools to configure your IP core. It is offered in a licensing model. You can contact us for a free nearbAI Estimator tool license to evaluate the performance of your AI application on the nearbAI IP.
high performance, low power, low cost – your choice
nearbAI targets embedded pattern matching applications close to the sensors, where at least one of the following plays a pivotal role: ultra-low and non-variable inference latency for real-time reaction speeds, ultra-low power consumption for battery-powered operation, or lowest hardware component cost for high-volume applications. These applications include novel ultra-low latency AR and VR glasses, battery-powered healthcare wearables, human-in-the-loop medical diagnosis equipment, smart product scanners for retail applications, self-navigating drones, collision avoidance in vehicles, sophisticated in-line quality inspection in Industry 4.0, and earth observation in satellites. Targeted sensors include various types of image sensors, MEMS microphones, and any novel sensors.
evaluate and finetune your application using nearbAI software
The starting point in the nearbAI design flow is a trained neural network model. You can create that using your preferred machine learning framework such as PyTorch, TensorFlow or Keras. The nearbAI Estimator tool reads in your trained model as an ONNX file. Besides that trained model, you input your desired hardware configuration and constraints in the Estimator tool. The latter reflects aspects of your use case such as targeted hardware cost (silicon area), inference speed and latency, and power consumption. The nearbAI Estimator tool shows the latency for each layer of the neural network and highlights any hardware bottlenecks. This allows you to interactively finetune the constraints, evaluate the resulting performance, and arrive at the optimal configuration for your use case, without having to build hardware or run endless simulations.
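As a rough illustration of the kind of analysis such a tool performs, the sketch below computes per-layer latency from per-layer MAC counts and a MACs-per-cycle hardware budget, and identifies the bottleneck layer. The layer names, counts and hardware figures are invented for illustration; the real nearbAI Estimator works on an ONNX model and a far more detailed hardware configuration.

```python
# Back-of-the-envelope illustration of latency estimation: given each
# layer's multiply-accumulate (MAC) count and a hardware budget of MACs
# per clock cycle, compute per-layer latency and flag the bottleneck.
# All numbers are hypothetical, not nearbAI Estimator output.

def estimate_latency(layers, macs_per_cycle, clock_mhz):
    """layers: list of (name, mac_count). Returns (per-layer µs, bottleneck name)."""
    per_layer = [(name, macs / macs_per_cycle / clock_mhz)  # cycles / (cycles/µs)
                 for name, macs in layers]
    bottleneck = max(per_layer, key=lambda t: t[1])[0]
    return per_layer, bottleneck

layers = [("conv1", 1_000_000), ("conv2", 4_000_000), ("fc", 500_000)]
per_layer, bottleneck = estimate_latency(layers, macs_per_cycle=256, clock_mhz=200)
total_us = sum(t for _, t in per_layer)
# bottleneck == "conv2": it dominates the total inference latency
```

Seeing which layer dominates is exactly what lets you trade off silicon area (more MACs per cycle) against latency before building any hardware.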
generate your proprietary nearbAI core
Next, you use the nearbAI Core Generator tool to generate your proprietary nearbAI semiconductor IP core. You then use the nearbAI Network Compiler to generate the microsequence that sits in memory next to the IP core and runs your neural network. As easics also provides ASIC and FPGA design services, it is ready to assist you in integrating nearbAI on your chip.
optimize your return-on-investment: from any FPGA implementation to your ASIC instantiation
nearbAI supports efficient hardware mapping on your custom ASIC as well as on Xilinx and Intel FPGAs. Evaluation boards using an FPGA or an FPGA System-on-Module (SoM) are available, and support plug-in of your ASIC test-chip. The same nearbAI core can run several neural networks using different microsequences. This way, a nearbAI-based product supports field upgrades as well as on-the-fly hot switching between neural networks in a running application. The fine-grained configuration options further allow you to generate several flavors of a nearbAI IP core, such as a low-performance and a high-performance version. This enables the efficient roll-out of a product family. All in all, nearbAI future-proofs and optimizes the ROI of your embedded AI developments.
we’ll be back
From 5 to 7 October, easics participated in VISION, the world’s leading trade fair for machine vision, in Stuttgart. easics gave live nearbAI demos and presented nearbAI in a talk at the Industrial VISION Days, organized by VDMA Machine Vision. On 14 October, easics presented nearbAI at the Bits&Chips Event at the Evoluon in Eindhoven. Upcoming opportunities to meet up and see live demos are the IP-SoC conference in Grenoble on 1 and 2 December 2021, and Embedded World in Nuremberg from 15 to 17 March 2022.
Contact easics for more information and for a free nearbAI evaluation license to try it out yourself:
Within the S3Food project, Flying Couch Brewery in Denmark has the opportunity to start a project together with Zymoscope. Various beer fermentation runs will be conducted in order to collect a sufficient amount of data, improve accuracy and reproducibility, and ensure scalability of the solution.
The goal of the project is to finalize the development of a data-driven fermentation management solution, i.e. a non-invasive Internet of Things solution to monitor and control fermentation in the beer industry.
9 out of 10 microbreweries in Europe do not have access to the right tools to monitor and control fermentation. Those that use anything at all end up with analogue solutions, which require time-consuming manual sampling and produce weak data flows. The lack of appropriate quality control results in low and inconsistent product quality, production inefficiencies and large amounts of beer and CO2 spillage.
With over 11,000 microbreweries in Europe, such a solution is very welcome in the industry. Fermentation in microbreweries is highly complex due to the variety of raw materials, different yeast strains, numerous quality parameters, confounding factors, and interventions throughout the process.
This project has received a validation voucher of S3Food: this voucher includes financial support for “the validation of a digital solution in a relevant environment related to a selected challenge of the food sector”.
A new breakthrough in the S3Food project: food waste is a gigantic and growing problem in developed countries and requires immediate action, as it constitutes a massive loss of resources throughout the food production value chain. This project aims at constructing, testing and optimizing a pilot process line to receive, handle and process bread waste. The technology enables the bread sector to reduce food waste by 50% by 2030, partly fulfilling UN Sustainable Development Goal 12.3.
The goal of this project is to integrate real-time sensor technology in an automated process that separates bread waste, such as day-old bread and packed bread, into homogeneous categories. By combining data from sensors with known product recipes, the composition of a finished bread-waste product can be optimized; after drying and grinding it will become an ingredient in new sustainable foods such as granola, bread crackers, muesli bars, pasta and bread chips.
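The recipe-plus-sensor optimization can be illustrated with a minimal two-category blend: given the composition of each waste category, solve for the mixing fraction that hits a target for the finished product. All numbers below are invented for illustration and are not the project's actual model or data.

```python
# Minimal sketch (illustrative numbers only): choose the mix of two
# bread-waste categories so the blended product hits a target protein
# content, using known recipe compositions.

def blend_fraction(protein_a, protein_b, target):
    """Fraction f of category A such that f*a + (1-f)*b == target."""
    return (target - protein_b) / (protein_a - protein_b)

# Hypothetical protein contents (%) from recipe data and sensor readings.
day_old_bread = 12.0
packed_bread = 8.0
target = 10.5

f = blend_fraction(day_old_bread, packed_bread, target)
# f = 0.625 → 62.5 % day-old bread, 37.5 % packed bread
blended = f * day_old_bread + (1 - f) * packed_bread  # = 10.5
```

With more categories and more quality attributes, the same idea becomes a small linear program, but the two-category case already shows how recipe data and sensor readings combine.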
Funded project within S3Food: Next Generation Sensors B.V. and CIED B.V. in the Netherlands aim to develop a portable real-time scanner for contamination in food production, called “The Revolutionary Real Time AgroFood Contaminant Screener”.
S3Food is the pan-European project for the digital Industry 4.0 transition, in which DSP Valley takes part as one of the 13 international consortium partners. It funds this project, which will develop the portable real-time scanner for contamination in milk production. The dairy industry is the second-largest food type in the agri-food testing market, which is why the project will start with that industry as the launching market. After the product launch, meat and fruit/vegetable companies – and food/feed companies in general – will follow rapidly.
An unprecedented portable contaminant screener will be delivered in this Lab-to-Sample project. Developing a Minimum Viable Product (MVP) allows the partners to rapidly commercialise their joint innovation. The proposed screener’s underlying technologies are:
– an unparalleled portable mass spectrometer
– a novel rapid sampling probe
– a novel blockchain ledger
– machine learning algorithms
Together, these technologies result in a revolutionary real-time dairy contaminant screener. Accurate, real-time contamination detection at the source would:
1) prevent the spread of contamination caused by milk pooling, significantly reducing food waste;
2) reduce analysis costs; and
3) drastically reduce the need for costly food recalls.
Next Generation Sensors B.V. + CIED B.V. are also collaborating with The Maastricht MultiModal Molecular Imaging Institute (M4I).
easics, a Leuven-based market leader in embedded digital systems design, shifts into its next gear as a company. In addition to providing unique competences and development platforms that lead to first-time-right, reliable and optimized logic and software that remains maintainable by the customer, easics now offers network and Artificial Intelligence (AI) product/IP solutions for system developers of health and industrial products. easics’ nearbAI technology provides a compact, low-power and affordable AI core that runs complex neural networks on FPGAs and ASICs at the edge, close to the sensors. This results in low and predictable latency with ultra-low power consumption. easics’ embedded AI solutions integrate tightly with novel and existing sensors such as image sensors capturing light inside and outside the visible spectrum (such as hyperspectral and thermal infrared), 3D scanning laser (LiDAR), Time-of-Flight (ToF) sensors, radar, microscopy, ultrasound sensors and microphones, and thus enable many novel businesses.
To enable this growth ambition, Emiliano D’Agostino joined the MT as managing director & CEO of easics as of April 2021. He is an executive with hands-on experience in the medtech industry. Before joining easics, he founded DoseVue, a medtech company that he brought from incorporation to commercial product launch. He received a PhD in electrical engineering, focused on medical image processing, from the KU Leuven in 2006. He holds master’s degrees in physics engineering from the Université Libre de Bruxelles and in nuclear engineering from the Politecnico di Milano.
“I am really looking forward to helping easics move to the next phase by expanding easics’ product-related activities, in particular in the medical sector. The company has tremendous potential and is already a trusted partner for several big international players in the medical and industrial fields,” says Emiliano. Prof. Oosterlinck, rector emeritus of the KU Leuven and Chairman of the Board of easics since its foundation 30 years ago, welcomes Emiliano: “Emiliano is an international manager with very good knowledge of the health and industrial business, and an excellent alumnus of the medical imaging research group of the KU Leuven, which I started more than 45 years ago. I am sure that he will bring easics to the next level.” “We welcome Emiliano to the MT of easics and look forward to growing the company further with him,” say Ramses Valvekens and Steven Coenen, the managers and main stakeholders of easics.
Visit easics at their website and follow them via their social media channels: