
Design and development of REAlistic food Models with well-characterised micro- and macro-structure and composition

Final Report Summary - DREAM (Design and development of REAlistic food Models with well-characterised micro- and macro-structure and composition)

Executive Summary:
In terms of composition and structure, foods are very complex systems. Although scientists have a good grasp of the former, the control of food structure remains difficult. Nevertheless, knowledge of food structure and of its changes during processing is key to understanding the effect of food on human health and well-being. The DREAM project succeeded in developing food models with well-characterised structures for simulating the impact of processing on the nutritional and microbiological properties of food. The development of standard food models, each representing a major food category, made it easier for public and private research partners to pool their knowledge, and enabled partners from the food industry, especially SMEs, to benefit from models that are both generic and realistic enough to optimise their existing processes or to devise new ones.
To address the broadest possible range of food products and take into account their high variability, foods were classified into four generic structural groups. For each group, a relevant specific risk and/or nutritional benefit was identified, and the process-structure-functionality relationships were determined and modelled.
The four food types are: 1) filled cellular solids, represented by fruits and vegetables; 2) proteinous cellular networks, represented by meat (pork muscle); 3) combined gelled/dispersed systems, represented by dairy desserts and cheeses; and 4) open solid foams, represented by cereal products (bread and biscuits). The innovation brought by DREAM consisted in applying cognitive science to integrate know-how and scientific knowledge into the development of model foods and their standard operating procedures (SOPs) for use by food industries, a hitherto unprecedented approach.
For fruits and vegetables, the changes in product macro- and microstructure before and after processing that lead to changes in the bioaccessibility of selected phytochemicals were characterised. The phytochemicals under consideration were lycopene for tomatoes, procyanidins for apples and glucosinolates for brassica vegetables. The results were integrated into mathematical models to predict the transfer of procyanidins from fruit to juice depending on processing conditions, or the effects of processing on the bioconversion and bioaccessibility of glucosinolates. For pork muscle, key variables linked with the in vitro protein digestion of meat cooked at two different temperatures were identified, and a mimetic model was elaborated allowing the determination of the kinetic laws that govern changes in protein denaturation and oxidation. For dairy desserts, a mathematical model was developed to predict the texture of dairy emulsions by coupling a kinetic model of protein denaturation with a simulation of the colonisation of fat-droplet interfaces by proteins and of bridge creation between droplets. For cheeses, two reproducible generic cheese models were developed, and the definitions of all cheese-making control and state variables were compiled in a reference document for cheese makers. For cereal products, generic bread and biscuit models were developed to study the influence of added dietary fibre on product quality and to build mathematical models describing dough formation and product quality.
These models were also used to assess nutritional properties as well as microbiological food safety. In vitro digestion and in vivo studies highlighted the effect of the different processes in changing the physical and molecular structure of food and, consequently, the bioaccessibility and bioavailability of nutrients or bioactive compounds. The impact of processing and storage on microbial food safety and quality was quantified by inoculating the model foods, following the microbial behaviour and implementing decision-making tools to simulate microbial performance in food.
The applicability of these models was also tested at the industrial level, with feedback and suggestions returned to the model developers for further improvement. An industry guide for food modelling was finalised; it summarises the practical use of the DREAM models and also aims to create awareness and encourage the use of modelling in industry. Throughout the project, dissemination activities ensured the exchange of information with the scientific community, improved the information available to European citizens and promoted the best possible use of the project results by the food industry and the authorities concerned (training and career development, dissemination to industry and food authorities).

Project Context and Objectives:
In terms of composition and structure, foods are very complex systems. Although scientists have a good grasp of the former, the control of food structure remains difficult. Nevertheless, knowledge of food structure and of its changes during processing is key to understanding the effect of food on human health. The DREAM project aimed to develop food models with well-characterised structures for simulating the impact of processing on the nutritional and microbiological properties of food. The development of standard food models, each representing a major food category, made it easier for public and private research partners to pool their knowledge, and enabled partners from the food industry, especially SMEs, to benefit from models that are both generic and realistic enough to optimise their existing processes or to devise new ones. Scientists also need generic but maximally realistic models that can mimic the complexity of food structure. Such models make it much easier to assess the impact of a change in composition or in processing conditions on the nutritional and health properties of foods.
To address the broadest possible range of food products and take into account their high variability, foods were classified into four generic structural groups: filled cellular solids (fruit and vegetables); proteinous cellular networks (meat); combined gelled/dispersed/aerated systems (dairy products such as yogurts, creams and cheeses); and open solid foams (cereal products such as bread and biscuits).
The innovation brought by DREAM consisted in applying cognitive science to integrate know-how and scientific knowledge into the development of model foods and their standard operating procedures (SOPs) for use by food industries, a hitherto unprecedented approach. Industrial partners (SOREDAB and UB) and five industry-oriented organisations (ADRIA, CCFRA, CCHU, ACTILAIT and TIFN) were integrated into the project, contributing to the specifications and providing validation feedback for overall improvement and standardisation. The applicability of the model foods and food models was assessed before the protocols were transferred and the knowledge gained disseminated to industry and other stakeholders (EFFoST, the European "Food for Life" platform and national platforms, the CIAA and national federations, EFSA and national regulatory bodies).
The DREAM project followed a V-cycle aimed at enabling both the specification and the validation of the developed models by stakeholders of the agrifood sector. In a first stage, stakeholders' needs and knowledge, notably but not exclusively those of industry and of the authorities, were gathered and processed by WP1 in order to translate them into specifications for the development of the model foods and food models (WP2, WP3, WP4 and WP5) in a second stage. In a third stage, the applicability of the model foods and food models was assessed before the transfer of the protocols (WP7) and the dissemination of the knowledge gained to industry and other stakeholders (WP8).
The challenge of WP1 (Mathematical knowledge integration for food model numeric simulation) was to develop or adapt applied mathematical tools able to predict the emerging organisation of a food model at different scales and the functions associated with it, the so-called Integrated Knowledge Models (IKMs). Several authors had shown that this requires resolving the enormous challenge of unifying complex and dissimilar data, knowledge and models, specifically to understand the dynamics of such a complex food system (Perrot et al., 2011); this is particularly true when applied to real systems. Food structure prediction and structure-function relationships are only well established for simple, even simplistic, models of foods (gels, emulsions, dry foams, etc.), which are far from the foods they are meant to represent, at least in composition and function, and thus of little use with respect to industry's needs. Likewise, the improvement of food processes by optimisation methods is restricted to the few applications where the mathematical modelling is complete (Banga et al., 2003). In this context, the development of mathematical approaches able to take into account heterogeneous knowledge and, simultaneously, the uncertainty on the system was promising and was investigated in DREAM.
WP2 (Filled cellular solid model) aimed to develop well-characterised, realistic food models for plant foods, able to serve as tools for integrating and harmonising food and nutrition research on plant food products. Fruit and vegetables are physiologically active products that, even within the same variety, vary in many characteristics depending on cultivation conditions (location, soil, weather, light, etc.) and post-harvest conditions. Even with standardised protocols, therefore, variation will remain, and there is a need to link processing behaviour with measurable characteristics that determine the kinetics of nutritional changes. Mechanistic mathematical models are needed that are robust enough to deal with the natural variation and still give meaningful results for product and process optimisation with respect to nutritional, sensory and safety quality attributes.
Heating is one of the most important processes applied to animal tissues, since meat is usually consumed after cooking at industrial or domestic scale. This physical process induces structural and chemical changes that can affect the nutritional value of meat. The work developed in WP3 (Proteinous cellular network model) focused on meat and on the impact of heating on the physicochemical changes and digestibility of its proteins. Homogeneous muscle models are needed to evaluate the effect of different processing parameters on the nutritional quality of meat. One objective was to define the best experimental models representative of meat products, by selecting and assessing meat tissues or by creating artificial mimetic samples. Another objective was to evaluate the effect of the compositional and structural properties of proteinous foods on the reactions promoted by heating.

In WP4 (Combined gelled/dispersed/aerated systems model): at the start of the project there were no gelled or dispersed model systems that could be consistently reproduced and made widely available for testing nutrient, allergen or toxicant release, microbiological safety, etc. We therefore aimed to produce well-characterised, realistic food models for these types of food systems, including cheeses and desserts, that could serve as tools for harmonising food quality, safety and nutrition research on these types of food products. We also aimed to produce mathematical models relating model composition (such as fat content and protein ratio) and processing to model functionality (such as nutrient release and texture).
In WP5 (Open solid foams model): open solid foams represent the structure of baked cereal products, whose properties are closely related to the intrinsic properties of the solid material, its density and its cellular structure. The solid phase is a composite blend of biopolymers and components of lower molecular weight. The size and distribution of the pores are also important for product behaviour. Fracture properties and the dynamics of water in the matrix are important for the sensory quality of products based on a solid foam structure. The micro- and macro-level structure is likely to have an important effect on digestibility and on the delivery of small-molecular-weight components in the human gastrointestinal tract. The aim was to develop models for open-foam cereal foods, with special reference to the effects of whole grain and fibre on structure.
In WP6 (Model food applicability): changes in composition or process conditions may deeply affect the microbiological quality and safety of foods, as well as the bioaccessibility of toxic or bioactive compounds and therefore their absorption during digestion. To assess their reproducibility and industrial practicability, Generic Model Foods (GMFs) were standardised and validated. Their physico-chemical characterisation can provide information on the bioaccessibility of nutrients and/or toxicants, which is fundamental for understanding the effect of different processes on the bioavailability (studied in vitro and in vivo) of selected nutrients and their impact on consumer health. Additionally, GMFs can be a powerful tool for microbiologists: based on the behaviour of beneficial, spoilage and pathogenic populations, GMFs can help to optimise formulations, processes and storage, and ultimately to improve risk assessment, via the implementation of available decision-making tools.
In WP7 (Technology transfer): the use of existing modelling approaches as time-saving and cost-effective tools for assessing and optimising processes and their impact on product quality, and for supporting decisions during process and product development, is currently rather limited in industry, especially in SMEs. Technology transfer was integrated into the project to ensure that model development was conducted in collaboration with food manufacturers, through an industry needs and feedback approach. The objectives were to ensure that the practical needs of industry, especially those of SMEs, were considered during the development of the realistic model foods; to convert the research results into formats that can be used by industry as simple process and product development tools; and to test their applicability at the industrial level and provide feedback for further improvement.
In WP8 (Dissemination): as DREAM addressed important industrial and societal issues alongside fundamental scientific questions, we paid particular attention to technology transfer, since the consortium is convinced that the project results can improve the competitiveness of the European agri-food sector, which was one of the rationales of the project. As the project was designed to respond to some of the most important issues listed by the European Technology Platform on Food for Life, which expresses industrial expectations of scientific research, industrialists will certainly be interested in taking up the project outcomes. Another priority target of the project is food authorities. Last but not least, European citizens will certainly be interested in the societal stakes of the project, as it responds to some of the "hottest" societal topics: food safety, nutrition patterns and food supply. For these reasons it was important to disseminate the outcomes as early as possible and to the relevant stakeholders. Our vision was to promote the best possible use of the project results by the food industry and the authorities concerned, to ensure fruitful exchange with the scientific community, including individual scientists and initiatives, and to inform European citizens about the societal stakes of the project and the way their money is used.

Project Results:
The description of the main S&T results and foregrounds has been released in the DREAM book of results:

WP1-Key result 1: In silico comprehension and prediction of the structure and texture of a dairy dessert.
Dairy products have been experimentally shown to behave like complex systems: their resulting textures depend on various factors, including their composition and their processing conditions. Among these processing conditions, the most influential are the nature of the heat treatment and the parameters applied during acidification and homogenisation. Being able to predict texture from process conditions is therefore an interesting challenge for industry. The work developed under WP1 answered this challenge by developing an in silico model able to integrate the available knowledge and the uncertainty on the domain. The originality of the approach lies in the integration of recent theoretical developments at the crossroads of applied mathematics and computer science. The predictions of the model have been validated on experimental data obtained under WP4 for different neutral dairy emulsions, and generalised to a cream cheese model.

Research aims and background: Dairy products have been experimentally shown to behave like complex systems: their resulting textures depend on various factors, including their composition and their processing conditions. Among these processing conditions, the most influential are the nature of the heat treatment and the parameters applied during acidification and homogenisation (Foucquier, 2011). From an industrial point of view, the texture of dairy products is of crucial importance: consumer appreciation of dairy desserts such as gels, yogurts or cream cheese variants is influenced by the texture of the product. Given the difficulty of thorough experimental product characterisation along the production chain, mathematical simulation and modelling approaches are well-suited tools to gain a deeper understanding of how the composition and some of the processing parameters relate to the final structure of the product. The task is not easy, however, because of the complex interactions that can occur between key variables at different scales (Perrot, 2011) when mimicking a real system. The proposed model is based on knowledge originating from different domains (physical chemistry, microbiology, computer science, applied mathematics, etc.).
To understand the macroscopic properties of a dairy dessert, the story starts at a lower scale: the nanoscale (Descamps, 2013). A dairy dessert is an oil-in-water emulsion stabilised by milk proteins. The organisation of the macrostructure of the gel depends on the behaviour of each type of particle: denatured whey proteins, native whey proteins, protein aggregates and casein micelles. These particles are in competition to colonise the interface of each fat globule, and from this dynamic system a more or less regular and stabilised fat-globule interface is reached. From each individual interface and the connections between them, a connected organisation emerges at an upper level. We propose an integrated model that mimics this competition in silico. The knowledge of the laboratories working in WP4 has been integrated into the structure and some parameters of the model. Recent developments at the crossroads of applied mathematics and computer science have been included in the mathematical functions, allowing deterministic and stochastic algorithms to be coupled.

Results and successful application: The model couples a first-order differential kinetic model of protein denaturation (M1) with a stochastic model simulating the colonisation of fat-droplet interfaces (M2a) and bridge creation between fat droplets (M2b). Coupling M1 and M2 makes it possible to simulate the emergence of a network at the mesoscale from local droplet considerations. The inputs needed are the initial relative concentration of each particle in the solution, the droplet size and volume distribution, and the thermal denaturation rate. The local organisation of the fat-droplet interface is predicted along two dimensions: the interface composition (percentage of the different particles fixed at the interface) and the interfacial concentration. Good predictions are observed for data obtained in experiments led by the WP4 partners (Descamps, DREAM congress). The macroscopic structure is explored through the number and organisation of the links between the particles in space. For example, the percentage of caseins was judged by the experts to have a strong influence on the perceived texture of the product, which the literature links to the number of fat-droplet connections. The prediction is in good accordance with this knowledge: for a casein/whey protein ratio of 80/20, a weak viscosity is measured (40 Pa.s at 0.001 s-1) and few local interfaces are predicted to be connected; by contrast, for a ratio of 5/95, the gel is structured, with a viscosity of 2600 Pa.s at 0.001 s-1 and a prediction of a highly connected structure.
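The shape of the M1/M2 coupling can be sketched in a few lines of code. The sketch below is purely illustrative: the particle types, rate constant, site counts and bridging probability are assumptions for demonstration, not the calibrated DREAM model, and the real M2 operates on droplet size and volume distributions rather than a fixed site count.

```python
import math
import random

def denature(c_native, k, t):
    """M1 (sketch): first-order denaturation kinetics. Returns the
    native and denatured whey-protein amounts after heating for time t
    with an illustrative rate constant k."""
    c = c_native * math.exp(-k * t)
    return c, c_native - c

def colonise(n_droplets, n_sites, conc):
    """M2a (sketch): stochastic colonisation of droplet interfaces.
    Each interface site is won by a particle type drawn with probability
    proportional to its bulk concentration."""
    species, weights = zip(*conc.items())
    return [[random.choices(species, weights)[0] for _ in range(n_sites)]
            for _ in range(n_droplets)]

def bridges(interfaces, p_bridge=0.3):
    """M2b (sketch): a bridge may form between two droplets when both
    carry denatured-whey aggregates at their interface -- a simplistic
    stand-in for the real bridging rule."""
    links = 0
    for i in range(len(interfaces)):
        for j in range(i + 1, len(interfaces)):
            if ('aggregate' in interfaces[i]
                    and 'aggregate' in interfaces[j]
                    and random.random() < p_bridge):
                links += 1
    return links

random.seed(1)
native, denatured = denature(c_native=5.0, k=0.05, t=30.0)  # g/L, 1/min, min
conc = {'casein': 20.0, 'native_whey': native, 'aggregate': denatured}
interfaces = colonise(n_droplets=50, n_sites=20, conc=conc)
print('denatured fraction: %.2f' % (denatured / 5.0))
print('predicted droplet-droplet links:', bridges(interfaces))
```

In this toy setting, raising the casein share lowers the chance that aggregates reach the interfaces, and hence the number of predicted links, which matches the qualitative trend reported above for the 80/20 versus 5/95 ratios.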

This approach has been generalised and applied, using data coming from industry, to a cream cheese model (Jelinko, 2013). The experimental data and model simulations show similar tendencies, also demonstrated in previous studies: increasing the homogenisation pressure in cream cheese production over the interval 0-50 MPa resulted in an increased storage modulus (G'), a firmer texture and a larger number of links.

Significance and benefits: This work is a first contribution able to predict real dairy dessert structures across scales. The cross-fertilisation of the different disciplines (food science, computing and applied mathematics) has led to a model of understanding that can be used by industry for optimisation or sustainability purposes.

Prospects and challenges: This approach needs to be generalised to be sufficiently generic to cover the wide range of dairy desserts produced by industry. This requires a deeper understanding of what takes place at the interface, coupled with complex-systems tools developed specifically to handle this understanding and to propose ways of optimising the process.

WP1-Key result 2: Reduction of complexity of an in-silico milk-gel model, using visualization and optimization
Replicating in silico the structuring dynamics of food models is a relevant challenge for a better understanding of these systems. It is therefore important to simplify as much as possible the structure and the number of parameters of in-silico food models: given their intricate structure, it is otherwise hard for experts to thoroughly explore the behaviour of the system and search for meaningful correlations between parameters. In this work, developed under WP1, we combine visualization with model exploration to search for correlations in an established computer model of a milk gel, following these steps: (a) data are collected during the computations of a learning algorithm; (b) the data are made available via a multidimensional visualization tool; (c) subset selection tools and navigation in the multidimensional parameter space help the expert evaluate the behaviour of the model. Through this approach, we found a correlation between two parameters of the model, which we were able to support with a formal analysis.

Research aims and background: While the structural characteristics of pure protein aggregates submitted to heat treatment are widely studied (Rabe 2011), research on aggregates of casein coupled to whey proteins (denatured or not) is still in its initial stages (Morand 2012). The models being built become more and more complex and necessitate robust and efficient algorithmic techniques. This work is a contribution to the design of such complex models: it addresses the question of parameter learning using robust optimization techniques and visualization. The point here is to show that observing the behaviour of an optimization algorithm yields important information about the optimization problem itself, and consequently about the model under study. The experiments conducted in this work were based on an evolutionary algorithm (EA), a stochastic optimisation technique that relies on the computer simulation of natural evolution mechanisms. EAs are especially well suited to the resolution of difficult optimization problems, and particularly to learning the optimal parameters of complex models (Baeck 1993). Classical uses of EAs only consider the best individual of the last population as an estimate of the optimum, but recent work points out the potential benefit of visualizing data collected during the execution of an EA (Lutton 2011) and shows how a multidimensional visualisation tool, GraphDice (Bezerianos 2010), can help to navigate efficiently inside the data set collected during an EA run.

Results and successful application: Data were collected during two emulsification experiments (used as training and validation sets respectively), in which the continuous phase of the emulsion was formed by dissolving milk proteins in permeate. The analysis is based on a previously developed model (Foucquier 2011) that predicts the structure, characterised by the percentages of adsorbed caseins and native whey proteins and by the interfacial concentration. This model depends on 5 unknown parameters, which can be learned from experimental points (the learning set) using an EA that searches a 5-dimensional space. A visual exploration (using GraphDice) of the set of points visited during the optimization process shows convergence towards a rather large area of values for a pair of parameters, suggesting a possible dependence between them. This evidence was then confirmed by a mathematical rewriting of the differential equations of the model, making it possible to consider 4 unknown parameters instead of 5. An optimization run within this reduced search space yielded a good fit of the 4-parameter model. These results were confirmed on the validation set.
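The ingredients of such an EA-based parameter fit can be sketched minimally. Everything numerical below is an illustrative assumption: a toy two-parameter exponential model stands in for the 5-parameter (later 4-parameter) milk-gel model, and the archive of visited points is simply counted here, whereas the actual work explored it with GraphDice.

```python
import math
import random

# Synthetic 'experimental' points generated from the toy model
# y = a * exp(-b * t) with a = 2.0, b = 0.5 (illustrative only).
DATA = [(t, 2.0 * math.exp(-0.5 * t)) for t in range(10)]

def fitness(params, data):
    """Sum of squared errors of candidate parameters against the data."""
    a, b = params
    return sum((y - a * math.exp(-b * t)) ** 2 for t, y in data)

def evolve(data, pop_size=30, generations=100, sigma=0.1, seed=0):
    """Minimal (mu+lambda) evolution strategy: Gaussian mutation and
    truncation selection. Every evaluated point is archived, as one
    would do to later explore the search trajectory visually."""
    rng = random.Random(seed)
    pop = [[rng.uniform(0.0, 5.0), rng.uniform(0.0, 2.0)]
           for _ in range(pop_size)]
    archive = []
    for _ in range(generations):
        offspring = [[max(g + rng.gauss(0.0, sigma), 1e-9) for g in p]
                     for p in pop]
        pool = sorted(pop + offspring, key=lambda p: fitness(p, data))
        archive.extend(pool)
        pop = pool[:pop_size]
    return pop[0], archive

best, archive = evolve(DATA)
print('best (a, b): (%.2f, %.2f)' % (best[0], best[1]))
print('archived points for visual exploration:', len(archive))
```

Plotting the archive coordinate-against-coordinate is what reveals converging clouds and suspected parameter dependences, as done with GraphDice in the study above.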

Significance and benefits: Even for skilled scientists, it is often extremely hard to validate the behaviour of an in-silico model: because of its complexity, an extensive exploration of the search space is often impossible. The proposed methodology generates a limited amount of data that is likely to be of interest to the user.

Prospects and challenges: This work can be extended to a general methodology for model exploration. Such a technique could be invaluable to assist experts in assessing the validity and the weak points of their in-silico models, both in the agri-food and other domains.

WP1-Key result 3: Coupling viability theory and active learning with kd-trees
Mathematical viability theory is very useful when trying to confine controlled dynamical systems within a set of desirable states. It defines the region of states (the viability kernel) where the system can evolve safely. Unfortunately, the computation of the exact kernel is a complex and computationally intensive task. We propose a method coupling a viability algorithm with active learning on a kd-tree in order to provide a compact representation of the viability kernel and to limit calls to the model, which, in the case of food models, are generally time-consuming.

Research aims and background: Viability theory is a set of mathematical and algorithmic methods for maintaining the evolutions of controlled dynamical systems inside a set of admissible states, called the viability constraint set. This framework has recently proved useful for food control in domains where it is unclear what a good candidate objective function for optimization would be [Sicard et al, 2012]. Unfortunately, the algorithms presently available have a complexity that is exponential in the dimension of the state space, a severe limitation on the use of the method in real applications. The work presented here aims at providing compact storage of the set being computed, in order to limit the number of calls to the dynamical model. It also aims at being reusable.

Results and successful application: An algorithm based on kd-trees has been developed to learn the boundary of a hyper-volume [Rouquier et al, submitted], treating the viability kernel as a classification function [Alvarez et al, 2010]. The objective was to limit the number of calls to the underlying model. A second algorithm has been developed to compute the viability kernel of a viability problem (a dynamical system defined as a black box, plus a set of desirable states), using the kd-tree storage algorithm above. This new algorithm has been proved to converge to the true viability kernel [Alvarez et al, 2013].
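The boundary-focused idea can be illustrated with a toy two-dimensional sketch. This is a simplification, not a reimplementation of the published algorithm: the unit disk stands in for a viability kernel, the oracle for an expensive model run, and boxes are split along alternating axes only while their sampled points disagree, so refinement (and memoised oracle calls) concentrates on the kernel boundary.

```python
CACHE = {}

def viable(x, y):
    """Black-box viability oracle. Here the unit disk stands in for the
    true kernel; in a real food model each call would run a costly
    simulation, so results are memoised to limit calls to the model."""
    if (x, y) not in CACHE:
        CACHE[(x, y)] = x * x + y * y <= 1.0
    return CACHE[(x, y)]

def refine(lo, hi, axis, depth, leaves):
    """kd-tree-style kernel approximation: a box whose sampled points
    (4 corners + centre) agree is stored as a single leaf; a
    disagreeing box is split along alternating axes until depth 0."""
    (x0, y0), (x1, y1) = lo, hi
    cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    vals = [viable(x0, y0), viable(x1, y0), viable(x0, y1),
            viable(x1, y1), viable(cx, cy)]
    if all(vals):
        leaves.append((lo, hi, True))
    elif not any(vals):
        leaves.append((lo, hi, False))
    elif depth == 0:
        leaves.append((lo, hi, None))          # undecided boundary box
    elif axis == 0:                            # split on x
        refine(lo, (cx, y1), 1, depth - 1, leaves)
        refine((cx, y0), hi, 1, depth - 1, leaves)
    else:                                      # split on y
        refine(lo, (x1, cy), 0, depth - 1, leaves)
        refine((x0, cy), hi, 0, depth - 1, leaves)

leaves = []
refine((-1.2, -1.2), (1.2, 1.2), axis=0, depth=10, leaves=leaves)
inner = sum((h[0] - l[0]) * (h[1] - l[1]) for l, h, v in leaves if v)
print('distinct oracle calls:', len(CACHE))   # vs 4225 for a 65x65 grid
print('inner area of the approximated kernel: %.2f' % inner)
```

The compact leaf list is the analogue of the kd-tree storage: large interior and exterior regions cost a handful of oracle calls, while the evaluation budget is spent near the boundary, which is exactly what matters for the viability kernel.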

Significance and benefits: This work is a crucial step towards providing a complete set of tools to compute the viability kernels and capture basins of viability problems. The code of the algorithm is open source and will be freely available at the ISC-PIF forge. Sustainability problems (such as resilience studies) and control problems that can be defined as viability problems can then be studied with the algorithm we provide.

Prospects and challenges: This method focuses on the boundary of the viability sets rather than on the sets themselves, which makes it possible to consider a state space with one additional dimension; but it still suffers from the curse of dimensionality. Further work is in progress to parallelise the refinement part of the algorithm. Real applications of the algorithm will also be implemented in order to validate the platform.

WP1-Key result 4: An integrative model of the quality and safety of vegetables during thermal treatment: application to the cooking of broccoli
Mathematical modelling can be very useful in the food industry for the control and design of processes. Thermal treatment is a widespread process, and the literature provides several models, from mechanistic to purely data-driven. Unfortunately, they often focus on the process itself or on a single macroscopic indicator of the food product. As a consequence, model coupling is a required step to provide an overview of the food's macroscopic characteristics. Process control therefore implies finding a trade-off between the complexity of the model and the accuracy of the prediction, so that the tools of control theory can be used. In this work, we propose a model taking into account food quality, including consumer acceptability, and food safety, applied to the cooking of broccoli. The results show a satisfactory, albeit improvable, prediction of the experimental data. Furthermore, the low complexity of the model makes it a good candidate for control applications.

Research aims and background: Glucosinolates (GSs) are beneficial components present in Brassica vegetables, which shown an ability to reduce the risk of several cancers. The concentrations of these compounds are strongly affected by the processing of vegetables, especially the heat treatment. A model has been developed to describe the fate of GSs during thermal processing (Sarvan et al. 2012). However, this model alone is not suitable for process control as maximisation of GSs concentration corresponds to raw cabbage. Indeed, optimisation of the control temperature only makes sense for models requiring antagonistic constraints on temperature, e.g. low temperature to preserve the GSs content and high temperature to ensure food safety. Our aim is to extend the previous model to other macroscopic features of vegetable during heat processes, related to the consumer acceptability (colour, texture) and the food safety. Literature provides simple models for this purpose, dealing with texture (Rizvi and Tong 1997) and colour (Tijskens et al. 2001). Inactivation of foodborne pathogens is extensively studied but the recent works mainly focus on the Weibull frequency distribution model (Mafart et al. 2002).
In addition, the efficiency of control-theoretic tools is directly related to the dimension of the model.
The aim of this work is therefore to show that the dynamic behaviour of several macroscopic characteristics can be predicted well by a low-complexity model.
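The Weibull-type survival model cited above (Mafart et al. 2002) can be sketched in a few lines; the parameter values here are hypothetical, chosen only to illustrate the two typical curve shapes:

```python
import numpy as np

def log10_survival(t, delta, p):
    """Weibull-type survival curve, log10(N/N0) = -(t/delta)**p,
    where delta is the time to the first decimal reduction and p is
    the shape parameter (p < 1: tailing; p > 1: shoulder)."""
    return -(t / delta) ** p

t = np.linspace(0.0, 300.0, 31)                   # heating time, s
tailing = log10_survival(t, delta=60.0, p=0.7)    # concave survival curve
shoulder = log10_survival(t, delta=60.0, p=1.5)   # convex survival curve
```

By construction, both curves pass through one decimal reduction at t = delta; the shape parameter p controls whether the curve tails off or shows a shoulder.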

Results and successful application: A model has been developed to reproduce the dynamics of the concentration of a single (or mean) GS in the vegetable and the cooking water, the colour and texture of the vegetable, and the concentration of a biological pathogen. The single control variable is the temperature, which is assumed to be homogeneous in both the vegetable and the cooking water and equal to the temperature of the heating device. As no data on thermal inactivation of a specific biological pathogen were available for this study, a "virtual" biological pathogen is considered, which allows testing different thermal resistances. Finally, the enzymatic degradation of GSs was neglected, so that all sub-models (GSs, texture, colour and biological pathogen) are independent, the only link being their temperature dependency.
The corresponding mathematical system (not shown) is a set of 7 ordinary differential equations with 16 parameters. Parameter estimation was performed on data from the cooking of broccoli, and the simulated results fit the experimental data satisfactorily for some of the state variables. Owing to the low complexity of the mathematical system, the equations were solved analytically, yielding explicit expressions of all state variables as direct functions of time (results not shown).
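The structure of such a coupled system can be sketched as follows; this is a reduced illustration rather than the actual 7-equation, 16-parameter model, and every rate constant below is a hypothetical placeholder:

```python
import numpy as np
from scipy.integrate import solve_ivp

R = 8.314  # gas constant, J/(mol K)

def arrhenius(k_ref, Ea, T, T_ref=373.15):
    """First-order rate constant at temperature T (K)."""
    return k_ref * np.exp(-Ea / R * (1.0 / T - 1.0 / T_ref))

def rhs(t, y, T):
    """Reduced cooking model: GS content in tissue (Cv) and cooking
    water (Cw), texture (X) and pathogen load (N); the sub-models are
    independent, coupled only through the temperature T."""
    Cv, Cw, X, N = y
    k_deg = arrhenius(2e-3, 8e4, T)     # thermal degradation of GSs
    k_leach = 1e-3                      # leaching from tissue to water
    k_tex = arrhenius(1e-3, 1e5, T)     # texture softening
    k_inact = arrhenius(5e-2, 3e5, T)   # pathogen inactivation
    return [-(k_deg + k_leach) * Cv,
            k_leach * Cv - k_deg * Cw,
            -k_tex * X,
            -k_inact * N]

T_cook = 368.15  # constant control temperature (95 degC)
sol = solve_ivp(rhs, (0.0, 1800.0), [1.0, 0.0, 1.0, 1e6], args=(T_cook,))
Cv, Cw, X, N = sol.y[:, -1]
```

Because each sub-model here is linear and first-order in its state, the system also admits the kind of explicit analytical solution mentioned above.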

Significance and benefits: When aiming at control of a food process, model coupling is unavoidable. This sensitive step can lead to increasingly complex models, which can be counterproductive. We show that an acceptable fit can be achieved with a low-complexity model. An added benefit of this simplicity is that it enables practical use of the model by food manufacturers, e.g. by including the analytical solution in a spreadsheet.

Prospects and challenges: While the presented model shows an acceptable fit to the experimental data, it must be extended to a larger range of control temperatures, including time-varying temperature profiles.
Further work is in progress to apply viability analysis tools in order to find control temperature kinetics leading to the identified targets.

WP1-Key result 5: Semi-supervised learning of a biscuit baking model, using symbolic regression and Bayesian networks
Machine learning methodologies can be an important aid to modeling experts in the food industry, allowing them to obtain reliable models more efficiently. While human expertise cannot be replaced, automated techniques can be exploited to obtain several candidate models from which an expert can later choose the best, or draw general conclusions about recurring patterns. In this work, developed under WP1, we perform a feasibility study on learning the model of an industrial biscuit baking process, using two different machine learning paradigms. Symbolic regression is employed to obtain a set of equations starting from raw data, and an interactive approach is used to learn a Bayesian network model after discretizing the original dataset. The results show a good prediction capability, although further improvements must be studied in order to produce physically meaningful models.

Research aims and background: Machine learning techniques have gained popularity in recent years, mainly due to the increasing complexity of problems faced in industry. Symbolic regression, an evolutionary technique based on Genetic Programming (Koza 1992), is able to automatically reconstruct free-form equations from data, uncovering hidden relationships between variables in a dataset. Commercial software using symbolic regression is already publicly available (Schmidt 2009). While symbolic regression can work on the original data without modification, the models produced can be very complex and not adherent to the physical reality of a process, and thus hard for a human expert to understand. Bayesian networks are graphical probabilistic models that work with discretized variables. They represent a set of variables and their conditional dependencies via a directed acyclic graph, and are widely used to represent knowledge in many different domains, ranging from computational biology to decision support systems. While the discretization of variables might introduce further sources of error, Bayesian network representations are intuitive for the end user; they can be validated by experts of a specific process with little to no knowledge of their inner workings; and they can even be modified manually. Several research lines work on the automatic and interactive reconstruction of Bayesian networks starting from data (Tonda 2012), and several libraries have been developed for the purpose (Druzdzel 1999). Our aim is to perform a feasibility study on the application of these techniques to the food domain, verifying whether it is possible to produce reliable models for a specific industrial process.

Results and successful application: Data were collected during 16 runs of an industrial baking process by the company United Biscuits: 12 runs are used for training and 4 for validation. The variables measured are the top and bottom heat flux in the oven, and the colour, height and weight loss of the biscuits. Additional features include the temperature in each zone of the oven, which may vary between runs. Both machine learning techniques try to find models that predict the considered variable at instant t+1, having access only to observable values (such as top flux, bottom flux and zone temperatures at time t, and the initial value of the considered variable at time t=0). Symbolic regression is able to find several good equations to predict colour, height and weight loss, performing well also on the validation set. The Bayesian network obtained through an interactive learning approach shows several relationships between variables that also appear among the best models produced by symbolic regression, thus supporting the findings of the previous step.
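The symbolic-regression step can be illustrated with a deliberately simplified sketch: a random search over small expression trees, on a synthetic dataset standing in for the baking data. The variable layout, the "true" relationship and the search itself are hypothetical stand-ins; the actual study used a full genetic-programming engine:

```python
import random
import numpy as np

# Synthetic stand-in for the baking data: predict a biscuit property from
# [top_flux, bottom_flux, initial_value]; the "true" relationship is invented.
rng = np.random.default_rng(0)
X = rng.uniform(0.5, 1.5, size=(200, 3))
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] * X[:, 2]

OPS = [(np.add, '+'), (np.subtract, '-'), (np.multiply, '*')]

def random_tree(depth=3):
    """Random expression tree; returns (evaluator, printable formula)."""
    if depth == 0 or random.random() < 0.3:
        if random.random() < 0.5:
            i = random.randrange(X.shape[1])
            return (lambda x, i=i: x[:, i]), f'x{i}'
        c = round(random.uniform(-1.0, 1.0), 2)
        return (lambda x, c=c: np.full(len(x), c)), str(c)
    op, sym = random.choice(OPS)
    (lf, ls), (rf, rs) = random_tree(depth - 1), random_tree(depth - 1)
    return (lambda x: op(lf(x), rf(x))), f'({ls} {sym} {rs})'

def mse(f):
    """Fitness: mean squared error of a candidate tree on the data."""
    return float(np.mean((f(X) - y) ** 2))

random.seed(0)
best_f, best_s = random_tree()
for _ in range(5000):   # random search stands in for GP selection/crossover
    f, s = random_tree()
    if mse(f) < mse(best_f):
        best_f, best_s = f, s
```

A real symbolic-regression engine would evolve a population with crossover and mutation rather than sampling trees independently, but the fitness criterion (error of the candidate equation against the data) is the same.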

Significance and benefits: Modeling experts usually need a long phase of trial and error before finding a satisfactory model of a complex process, as is often the case in the food industry. Semi-supervised learning techniques can generate, in a short amount of time, several candidate solutions with good fit, from which the expert can then choose the most promising or physically sound. An added benefit is the possibility of examining the candidate models for recurring patterns, which might unveil unknown relationships between variables of the process.

Prospects and challenges: While the obtained models perform well even on unseen data, the machine learning algorithms completely ignore the physical meaning of the models themselves. Further studies on semi-supervised approaches are needed in order to obtain high-quality results coherent with the physical phenomena underlying the processes.

WP2-key result 1: Using mathematical modelling to optimise processing of fruits and vegetables with respect to nutritional and sensory quality
The phytochemical content of processed fruit and vegetable products is highly variable and unpredictable. By studying the underlying mechanisms of changes in these contents during processing, mathematical models have been developed to simulate and optimise thermal processing conditions with respect to the phytochemical content of the final product. Combining these models with models describing texture and colour changes under the same conditions allows multi-criteria optimisation of overall fruit and vegetable quality. The modelling approach was illustrated with brassica vegetables and their glucosinolate phytochemicals. This case study can serve as a blueprint for the application of mathematical modelling to enhance the nutritional properties of plant foods in a much broader sense, while respecting their sensory quality.

Research aims and background: Fruit and vegetables are an important part of our diet. Their intake is associated with a reduced risk of many diseases, such as cancer, cardiovascular disease and type 2 diabetes. Phytochemicals in fruit and vegetables show important biological activities related to their health-promoting effects. The content of these phytochemicals in current fruit and vegetable products is highly variable. Next to breeding and cultivation, processing and preparation have been shown to be main sources of this observed variation. Understanding the changes in phytochemical content during processing and preparation allows the development of mathematical models to describe the effects of processing conditions on the final level. With these models, product and process optimisation can be performed to enhance the phytochemical composition of the final product while respecting the sensory quality attributes of the products.
The results of this research in terms of developed models and the approach to broaden the applicability to a wide range of phytochemicals in fruits and vegetables can be used by the food industry to efficiently improve the quality of plant foods with respect to their health promoting and sensory qualities.

Results and applications: As a case study, the thermal processing of brassica vegetables (broccoli, cabbages and Brussels sprouts) was studied in detail. By investigating the mechanisms responsible for the changes in phytochemical composition (glucosinolates in the case of brassica vegetables) during thermal processing, models were developed to describe: cell lysis kinetics; glucosinolate and enzyme leaching kinetics; enzymatic conversion kinetics; enzyme inactivation kinetics; thermal degradation kinetics of glucosinolates in vegetable tissue; and thermal degradation kinetics of glucosinolates in processing water. With specific experimental set-ups, the parameters in these models can be estimated efficiently.
In addition to the phytochemical model, semi-mechanistic models to describe the kinetics of texture and colour changes during processing have also been developed and their parameters estimated.
With the developed set of models, process conditions can be optimised with respect to product quality (health-promoting effects as well as sensory quality). The developed approach was illustrated on a specific group of phytochemicals in specific vegetables. However, the same approach, and often even the same models, can be used on a much broader range of plant foods and their specific phytochemicals. What is needed to apply this is to estimate the specific model parameters for each case; the experimental procedures developed in this project can serve as a guideline for that.
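The parameter-estimation workflow (first-order rate constants from isothermal experiments, then their temperature dependence from an Arrhenius plot) can be sketched as follows, on synthetic noise-free data with hypothetical rate constants:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

# Hypothetical first-order degradation rate constants at three temperatures.
k_true = {353.15: 1.2e-3, 363.15: 2.8e-3, 373.15: 6.1e-3}  # 1/s
t = np.linspace(0.0, 600.0, 13)                            # sampling times, s

# Step 1: estimate k at each temperature from ln(C/C0) = -k t.
ks = {}
for T, k in k_true.items():
    C = np.exp(-k * t)                       # synthetic "measurements"
    slope, _ = np.polyfit(t, np.log(C), 1)   # linear fit of ln C vs t
    ks[T] = -slope

# Step 2: Arrhenius plot, ln k = ln A - Ea/(R T); the slope gives Ea.
invT = np.array([1.0 / T for T in ks])
lnk = np.log(np.array(list(ks.values())))
slope, intercept = np.polyfit(invT, lnk, 1)
Ea = -slope * R  # activation energy, J/mol
```

The same two-step structure carries over when the "measurements" come from real isothermal degradation experiments, with the regressions then absorbing the experimental noise.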

Significance and benefits: With the developed models and experimental approaches to estimate model parameters, it is now possible to efficiently describe and optimise the phytochemical content of plant food products while respecting their sensory quality. The results can be used by the food industry to efficiently improve products, but also by nutritionists who want an overview of the phytochemical contents of plant foods after processing and/or preparation. Most nutritional databases only mention the contents of raw materials and perhaps one standard processed or prepared product. With the developed model, the effects of many processing and preparation conditions can be estimated and used to improve the quality of intake data in e.g. epidemiological studies.

Prospects and challenges: The developed approaches to study fruit and vegetables, as illustrated on the selected varieties and quality attributes in this work package, can serve as guidelines to study nutritional and sensory-related product properties in fruits and vegetables in a much broader perspective. With limited additional experimental effort, the parameters of the developed thermal process model can be estimated. With these specific parameters, the model can be applied to simulate and optimise the content of many phytochemicals in many fruit and vegetable varieties.

WP2-Key result 2: Optimization of a mathematical model describing the transfer of polyphenols from fruit to juice by understanding non-covalent interactions between procyanidins and pectins
Procyanidins are the main phenolics in cocoa and many Rosaceae fruits, and play a major role in their bitterness and astringency, as well as in their potential health benefits. Disruption of the natural matrix during processing, and interactions between procyanidins and cell walls, may have a strong influence on the release, the bioavailability and the biological activity of procyanidins. A mathematical model has been developed which is already applicable to predict the retention of procyanidins by cell walls, e.g. in juice extraction processes. However, it needs to be improved by adding the influence of pectin characteristics, namely the degree of methylation and the neutral side-chain composition, in order to be used by the food industry to enhance food quality.

Research aims and background: Polyphenolic compounds, including procyanidins, are commonly thought to be found mainly in the vacuoles of plant cells, where they are separated from other cellular components. However, many may also be associated with cellular components such as the cell wall, especially after cell injury when vacuoles rupture during processing. This results in the release of phenolic compounds, which may then associate with cell wall polysaccharides through hydrogen bonding and hydrophobic interactions. These interactions have a strong influence on the release, but also on the bioavailability and the biological activity, of procyanidins.
The capacity of cell walls to bind procyanidins depends on compositional and structural parameters, such as the contents and structure of the various cell wall polymers; the stereochemistry, conformational flexibility and molecular weight of the procyanidins; and the cell wall and procyanidin concentrations. It also depends on surrounding conditions, such as temperature, ionic strength or ethanol content. A mathematical model has been developed to describe the transfer of procyanidins from fruit to juice. Among the different polysaccharide classes (cellulose, hemicellulose, pectins), pectins have the greatest affinity for procyanidins. However, the mechanism by which procyanidins and pectins interact, and the structural and compositional parameters that influence their association, are not known.
Understanding the influence of structural and compositional parameters on procyanidin-pectin interactions may allow optimisation of the initial model to better describe the effect of mechanical processing on juice procyanidin concentration.
The model developed may be used by the food industry to identify influential parameters and to simulate them in order to optimise their processes and enhance food quality.

Results and applications: The interactions between procyanidins and pectins were studied in detail by varying the composition of both the procyanidins and the pectins. Experiments confronted B-type procyanidins from apple, with various degrees of polymerization, with different pectin substructures: homogalacturonans with different degrees of methylation, rhamnogalacturonan I with different neutral sugar side chains, and rhamnogalacturonan II as monomer or dimer.
The affinity constants of procyanidin-homogalacturonan interactions in solution are highest when both the procyanidin degree of polymerization and the homogalacturonan degree of polymerization are highest. Procyanidins interacted with highly methylated homogalacturonans mainly through hydrophobic interactions.
Associations between rhamnogalacturonan I fractions and procyanidins involved hydrophobic interactions and hydrogen bonds. No difference in association constants was found between rhamnogalacturonan I fractions with different neutral sugar side chains and procyanidins with a degree of polymerization of 9. Nevertheless, with procyanidins with a degree of polymerization of 30, rhamnogalacturonan I rich in long arabinan chains showed lower association constants, and rhamnogalacturonan I without neutral sugar side chains showed higher association. Only very low affinities were obtained with rhamnogalacturonan II. It seems that the ramification state of rhamnogalacturonan I limits its association with procyanidins. This might explain some of the variation in the transfer of procyanidins from fruit to juice with maturation, as one of the features of maturation is a loss of galactose and/or arabinose from the cell walls.
The influence of the degree of methylation of pectins can now be integrated into the mathematical model in order to optimise it, and the model can then be used by the food industry to optimise processes and enhance food quality.
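How association constants translate into bound amounts can be sketched with a simple 1:1 Langmuir-type isotherm; the constants below are hypothetical and serve only to contrast short and long procyanidins:

```python
import numpy as np

def bound(C_free, Ka, Bmax):
    """Langmuir-type 1:1 binding: amount of procyanidin bound per unit
    pectin at free concentration C_free, with association constant Ka
    and binding capacity Bmax (all values hypothetical)."""
    return Bmax * Ka * C_free / (1.0 + Ka * C_free)

C = np.linspace(0.0, 2.0, 50)              # free procyanidin, g/L
low_dp = bound(C, Ka=0.8, Bmax=0.4)        # short procyanidins: weak binding
high_dp = bound(C, Ka=4.0, Bmax=0.6)       # long procyanidins: strong binding
```

In a transfer model, a higher association constant shifts more procyanidin into the cell-wall-bound pool and thus lowers the amount recovered in the juice.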

Significance and benefits: Interactions between procyanidins and pectins have been studied in detail and can now be integrated into the mathematical model describing the transfer of procyanidins from fruit to juice after mechanical treatment. The results obtained can be used by the food industry to improve food quality.
Moreover, the fining process for the removal of procyanidins, used for clarification and astringency reduction in wines, traditionally relies on protein extracts. Wine manufacturers could use the results obtained here to optimise their fining treatment by using fibres as an alternative to fining with proteins in winemaking.

Successful applications: The impact of the temperature and duration of pressing identified in the initial model has been validated by the Institut Français des Productions Cidricoles for use in cider apple pressing.

Prospects and challenges: The two main research prospects at this point are:
-Application to understanding the impact of maturation on procyanidin extraction, with a thesis starting on that topic (applied to pears due to their simple phenolic composition and existence of a large range of degrees of polymerization);
-Validation of the impact of the interactions on the colonic fermentation patterns for cell wall – procyanidin complexes.

WP2-Key result 3: Tomato processing methods modify the bioaccessibility of lycopene
Tomato products are the main dietary source of lycopene. Its accessibility, i.e. its liberation from the food matrix, and its bioavailability, i.e. its subsequent transfer to its cellular targets at the end of the digestive process, determine its true health benefit against several cancers and degenerative diseases. Accessibility is greatly influenced by the physical properties of the food matrix and, in particular, is enhanced in cooked products. Hot break (HB) and cold break (CB) treatments, used by industry to control the viscosity of tomato purees, were shown to partially control the ability of tomato lycopene to diffuse from the puree to an oil phase. Food particle size and lycopene/matrix interactions were identified as the main factors affecting diffusivity.

Research aims and background: Modelling the availability of lycopene in response to processing was the objective for the tomato model in the DREAM project. Lycopene has already been identified as a health-beneficial compound. The initial step of the digestive process for such lipophilic micronutrients consists in diffusion from the plant matrix to the lipid phase of the emulsion of the bolus. For lycopene, the major tomato carotenoid, bioavailability is known to be enhanced in cooked products. Indeed, while most industrial tomato processes do not much modify the overall lycopene content, its bioaccessibility can be greatly affected. The aims of the research were therefore to identify the factors limiting lycopene diffusion and how these factors are modified by the processes applied to the fruit. Controlling carotenoid bioaccessibility through processing is a challenge for the fruit and vegetable industries, as a means to boost the nutritional value of their products.

Results and applications: The starting hypothesis for the model linked particle size to the ability of lycopene to diffuse to an oil phase. Tomato is a filled-cellular model, so that 1) the smaller the particles generated by the process (grinding, cooking temperature), the faster the diffusion of a nutrient, according to Fick's second law, and 2) diffusion depends on the number of intact barriers (i.e. intact cell walls or membranes) that may remain in the matrix between the lycopene and the intestinal membranes. The case study chosen for the project was two tomato purees obtained using either cold-break (CB) or hot-break (HB) processing, which exhibited contrasting lycopene diffusion rates when mixed with oil in a standardized protocol. Our first results indicated that no clear difference in particle size could explain the contrast we observed (Page et al. 2012). Wet sieving experiments indicated that most of the HB/CB difference was explained by the behaviour of the smallest particles. Using one standard commercial tomato juice, we also studied in more detail the other physicochemical parameters affecting lycopene diffusion, including the effects of pH and temperature, and how the diffusion is modified when tomato purees are mixed with an emulsion instead of pure oil. In the latter case, the diffusion rate rose, but the partition factor of lycopene between puree and oil was not much affected; it was even reduced by interactions with the emulsifier (Degrou et al. 2013). These last results indicated that molecular interactions between lycopene and other components of the matrix may affect lycopene availability, making the model more complex than expected and calling for a set of new experiments to recover the variables that should be implemented in the tomato model.
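The particle-size part of the hypothesis follows directly from Fick's second law: the series solution for release from a spherical particle (Crank) shows that the released fraction at a given time is governed by D·t/r². A sketch with a hypothetical effective diffusivity:

```python
import numpy as np

def released_fraction(t, r, D, n_terms=50):
    """Fraction of solute released from a sphere of radius r after time t,
    from the series solution of Fick's second law for a sphere (Crank):
    M_t/M_inf = 1 - (6/pi^2) * sum_n exp(-n^2 pi^2 D t / r^2) / n^2.
    D is an effective diffusivity; all values here are hypothetical."""
    n = np.arange(1, n_terms + 1)
    series = np.sum(np.exp(-(n ** 2) * np.pi ** 2 * D * t / r ** 2) / n ** 2)
    return 1.0 - (6.0 / np.pi ** 2) * series

D = 1e-12       # m^2/s, hypothetical effective diffusivity of lycopene
t = 600.0       # contact time with the oil phase, s
small = released_fraction(t, r=50e-6, D=D)    # 50 micron particles
large = released_fraction(t, r=500e-6, D=D)   # 500 micron particles
```

With these placeholder numbers the small particles release most of their solute while the large ones release only a minor fraction, which is the qualitative behaviour the starting hypothesis predicts; the experimental results above show that intact barriers and molecular interactions complicate this simple picture.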

Significance and benefits: From the DREAM results, it is now clear that carotenoid bioaccessibility can be deeply modified by processing methods, particularly by the very first temperature ramp (i.e. during the first minutes of heating, which corresponds to the main difference between HB and CB purees): when this ramp is applied efficiently, leading to a quick rise from the initial temperature, a significant quantity of carotenoid is made bioaccessible.

Prospects and challenges: At the end of the DREAM project, the main factors affecting lycopene diffusion from the matrix have been identified. Modelling now needs to be completed in order to set up a mathematical model that can predict the availability of the carotenoid from the processing parameters. Moreover, the identification of the smallest particles as the main factor in the contrast between HB and CB purees makes understanding the biochemical changes of the fruit tissues that lead to lycopene release a further research challenge. Structural description of these small particles is a challenge in itself.

WP3-Key result 1-Meat models and mathematical modelling to investigate how heating affects protein digestibility
Meat cooking is an important process, applied at industrial or domestic scale, that has nutritional impact. Two experimental models were developed to investigate how meat characteristics and heating conditions, respectively, affect the nutritional quality of meat: (1) meat categories that differ in their structure, composition and metabolic type, and (2) a mimetic model composed of a suspension of myofibrillar proteins whose composition can be modified by adjusting the concentrations of major meat compounds: iron, oxidant and antioxidant enzymes. In parallel, a mathematical model called 'stoichio-kinetic' was developed to integrate this knowledge. The model is composed of differential equations that represent all the elementary reactions. Model predictions agree with the experimental measurements. This approach can be applied to other meat processes and to sensorial properties.

Research aims and background: Apart from a few exceptions, animal tissues (meat and fish) are eaten after being cooked. Heating is therefore the most important process applied to this type of foods either at industrial or domestic scale. This physical process induces structural changes, at microscopic and macroscopic levels, and promotes protein changes that can have nutritional impacts.
A great number of studies have been carried out on meat and fish products to assess the impact of production parameters (genetics, breeding, feeding, slaughter conditions, processing conditions, etc.) on meat product quality. This is usually done by comparing samples that undergo a specific treatment with a reference sample. However, in technological or laboratory tests it is rather difficult to control and measure well all the parameters that can affect a specific output. This often prevents generalisation of the conclusions of a particular study and leads to contradictions between studies, because of interactions between sample properties and the phenomena involved.
With the aim of improving the generalisation of laboratory results, our work focused on the development of experimental and mathematical models. We worked in two main directions:
• To define the best way to prepare samples representative of meat products, either by selecting and assessing meat tissues or by creating artificial mimetic samples;
• To evaluate, using the above meat models and mathematical modelling, the effect of meat tissue characteristics and cooking conditions on the reactions that are promoted by heating and can have nutritional consequences.
The tools set up during the DREAM project can help engineers develop precooked or cooked industrial products with better nutritional properties. They can also be used by scientists to investigate a larger range of processing conditions: other meat tissue characteristics, other processes.

Results and applications: Parallel work was carried out to develop two complementary experimental models: (1) extreme examples of meat tissues representing three "meat categories" that differ in their structure, composition and metabolic type (Realini et al., 2013 a & b), and (2) a mimetic model composed of a suspension of myofibrillar proteins (Promeyrat et al., 2013 a & b).
Changes in protein state and nutritional value were related by applying two extreme time-temperature heating couples (10 min at 75°C or 45 min at 90°C) to the two extreme meat categories. The nutritional value was evaluated from in vitro digestion tests using either gastric pepsin or trypsin and α-chymotrypsin. The variables associated with changes in protein conformation and those linked with oxidation are positively and negatively correlated, respectively, with the rates of in vitro protein digestion.
The mimetic model was developed to avoid the confounding effect of the uncontrolled biological variability generally observed in animal tissue. Its composition can be modified by adjusting independently the concentrations of the major chemical compounds in meat: iron, oxidant and antioxidant enzymes, etc. This experimental model allows easy determination of the kinetic laws that govern changes in protein state, and assessment of how much the kinetic parameters are affected by initial product characteristics and cooking conditions.
Thermal denaturation can be modeled by a single first-order reaction. Oxidation is more complex (many interacting chemical reactions that are differently affected by pH and temperature). Thus, a mathematical stoichio-kinetic model which accounts for this complexity was successfully developed (Promeyrat et al., 2012). Simulation calculations allow analysis and prediction of the effects of (i) the iron content, which differs from one meat to another, and (ii) the heating conditions, through the time-temperature couples. The predictions are accurate enough to be used for practical purposes.
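The first-order denaturation model extends naturally to time-varying cooking conditions by integrating the Arrhenius rate along the temperature profile; the sketch below uses hypothetical kinetic parameters:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def native_fraction(times, temps, k_ref=1e-3, Ea=2.5e5, T_ref=343.15):
    """Native protein fraction under a time-varying temperature profile,
    for a single first-order denaturation reaction with Arrhenius
    kinetics: N(t) = exp(-integral of k(T(t)) dt). All rate parameters
    here are hypothetical placeholders."""
    k = k_ref * np.exp(-Ea / R * (1.0 / temps - 1.0 / T_ref))
    integral = np.sum(0.5 * (k[1:] + k[:-1]) * np.diff(times))  # trapezoid rule
    return float(np.exp(-integral))

t = np.linspace(0.0, 600.0, 601)  # 10 min of cooking, s
# 20 degC to 75 degC linear ramp over 5 min, then hold at 75 degC.
ramp = 293.15 + (348.15 - 293.15) * np.minimum(t / 300.0, 1.0)
harsh = native_fraction(t, ramp)
gentle = native_fraction(t, np.full_like(t, 333.15))  # constant 60 degC
```

Because the rate law is integrated over the whole profile, the same function accepts any time-temperature couple, which is exactly what makes such models usable for time-variable cooking conditions.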

Significance and benefits: We took a big step forward in predicting the impact of practical cooking conditions on protein denaturation and oxidation through the use of both experimental and mathematical meat models. These phenomena affect protein digestibility but also various other technical and sensorial qualities. For example, the kinetics of protein denaturation are certainly linked to cooking losses, and oxidation of myoglobin is known to determine colour. Thus, our strategy could be applied to many other targets.
Using a mathematical model is essential for extrapolating laboratory results to industrial process conditions. For example, protein denaturation and oxidation can now be predicted for time-variable cooking conditions, since all the reactions are represented by differential equations. Moreover, these calculations can easily be inserted into other mathematical models that predict the time evolution of the temperature distribution within meat pieces during cooking (heat transfer models based on the finite element technique).

Prospects and challenges: Our collaborative work provides a sound basis on which to build a mathematical tool, or simulator, that can predict the effects of various processes, such as chilled storage, modified atmosphere conditioning, curing and heat treatments, on the sensorial and nutritional qualities of processed meat products. This approach requires that new scientific knowledge be progressively added to improve the simulator, by assessing the parameters associated with each individual phenomenon or chemical reaction. The strategy based on experiments with both realistic and mimetic models is certainly the best way to achieve this goal.

WP3-Key result 2- Microstructure characterization of muscle tissue by quantitative imaging
Meat results from the post mortem transformation of animals' skeletal muscles. These muscles are mainly composed of water (75%), muscle cells of three different types (I, IIA and IIB), connective tissue and fat (3 to 5%). Muscle is strongly organized in fiber bundles that are more or less aligned depending on its type and functionality. The muscle structural architecture can be roughly modeled as a square lattice array of fluid-filled cylinders surrounded by fluid. With the MRI technique used here, we look at diffusion: the physical parameter related to the local motion of water. Diffusion in tissues differs from diffusion in free solutions because the compartments made by fibers hinder and restrict water motion.
The structural architecture of meat is hereby assessed not directly and destructively, as done by tomographic microimaging (microscopy), but indirectly and non-destructively.

Research aims and background: Although diffusion quantitative magnetic resonance imaging is a generic method to assess the Brownian random motion of molecules, it has mostly been applied to water, which is abundant in tissues and visible by MRI. Both the size and shape of muscle cells, as well as interactions within the different compartments of the muscle cell, can influence the diffusion properties. If the structure is anisotropic, as in muscle, water diffusivity also displays anisotropic behavior. As muscle, and hence meat, is highly organized, the diffusion of water is facilitated in the fiber direction, meaning that, in a group of fibers having the same direction, the apparent diffusion coefficient is maximum in the fiber direction and minimum orthogonal to it. We decided to make use of this anisotropy to characterize meat microstructure indirectly and non-destructively.
The anisotropic water diffusion can be modeled voxelwise as an ellipsoid by means of a second-order tensor, according to Diffusion Tensor Imaging (DTI). Moreover, by varying the intensity of the magnetic field gradient, one can measure the diffusion decay. This decay is then fitted by a bi-exponential curve at each voxel, in order to obtain quantitative maps at low and high gradient field values, exhibiting hindered and restricted diffusion, respectively.
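The voxelwise bi-exponential fit can be sketched as follows; the compartment fractions and diffusivities are hypothetical, and the data here are a noise-free stand-in for a measured decay:

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(b, f, D_fast, D_slow):
    """Bi-exponential diffusion decay: a fast (hindered) and a slow
    (restricted) compartment with volume fractions f and 1 - f."""
    return f * np.exp(-b * D_fast) + (1.0 - f) * np.exp(-b * D_slow)

b = np.linspace(100.0, 20000.0, 40)          # b-values, s/mm^2
signal = biexp(b, 0.7, 1.5e-3, 2.0e-4)       # hypothetical "measured" decay

popt, _ = curve_fit(biexp, b, signal, p0=[0.5, 1.0e-3, 1.0e-4],
                    bounds=([0.0, 1e-5, 1e-6], [1.0, 1e-2, 1e-3]))
f_fit, D_fast_fit, D_slow_fit = popt
```

Repeating this fit at every voxel yields the quantitative maps of hindered and restricted diffusion mentioned above.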
•A first objective was to obtain high-resolution imaging of meat tissue microstructure in situ by MRI, using diffusion tensor imaging (DTI) at different b-values, in order to assess the fiber structure of muscle tissue at the mesoscopic scale by quantitative imaging.
Moreover, knowing that the muscle structure can be roughly modeled as a square lattice arrangement of fluid-filled cylinders surrounded by fluid, we can plot the theoretical diffusion decay. This plot exhibits deep gaps in the attenuation curves that have been directly correlated to the geometry of the cylinder lattice.
•A second objective was therefore to determine whether such attenuation behavior can be found in the real meat structure made of aligned fibers.
The tools that were set up in this part of the DREAM project can be used as input for creating artificial mimetic meat samples and to feed mathematical models.

Results and applications: Regardless of the sample orientation relative to the magnet, diffusion-weighted signals were obtained with gradient fields applied in six directions, and a tensor diagonalization was then computed to express diffusion in the frame of the meat fibers. The great advantage of this approach is that diffusion (closely linked to the spatial fiber organization) is assessed free of the effects of local fiber orientations. After diagonalization, the first eigenvector corresponds to the main fiber axis; the second and third eigenvectors correspond to the two directions orthogonal to it. The same process was applied to each six-direction dataset at increasing b-values (100-20000 s/mm²).
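The diagonalization step can be sketched with NumPy on a single symmetric 3×3 tensor; the tensor values below are hypothetical, chosen only to show how the eigenvectors separate the fiber axis from its orthogonal plane.

```python
import numpy as np

# Hypothetical diffusion tensor (mm^2/s) for one voxel in the magnet frame;
# its six independent components come from the six gradient directions.
D = np.array([[1.2e-3, 2.0e-4, 1.0e-4],
              [2.0e-4, 6.0e-4, 5.0e-5],
              [1.0e-4, 5.0e-5, 5.0e-4]])

# Diagonalize the symmetric tensor: eigh returns eigenvalues in
# ascending order, with orthonormal eigenvectors as columns.
evals, evecs = np.linalg.eigh(D)

# Largest eigenvalue -> main fiber axis; the two smaller ones span
# the plane orthogonal to the fibers.
fiber_axis = evecs[:, -1]
lambda_parallel = evals[-1]
lambda_perp = evals[:2].mean()

# Fractional anisotropy, a standard scalar index of diffusion anisotropy.
md = evals.mean()
fa = float(np.sqrt(1.5 * np.sum((evals - md) ** 2) / np.sum(evals ** 2)))
```

In a full DTI pipeline this decomposition is applied voxel by voxel, so that the parallel and perpendicular decays discussed next are always taken in the local fiber frame.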
Quantitative mapping: Both parallel and perpendicular decays deviate from Gaussian diffusion, which would give a mono-exponential decay, and instead follow a bi-exponential shape. This highlights hindered and/or restricted diffusion of water outside and inside muscle fibers, or exchange between the two compartments. Diffusion maps reveal details due to spatial variations of structure at a scale much finer than the acquisition resolution.
Identification of structural and morphologic components: After the MRI experiments, samples were frozen at -180°C to prevent degradation of the muscle cells, and microscopic histological observations were performed. After spatial registration, histological images were first superimposed on high-resolution images (susceptibility-weighted gradient-echo images, 300 × 300 voxels) to identify morphologic components (fat/collagen network), and subsequently superimposed on quantitative diffusion maps.
Relationship between diffusion and muscle fiber types: To assess the architecture of the muscle, we investigated the relationship between diffusion and muscle fiber types. Histological cuts (approximately 1 mm × 1 mm) were observed after histoenzymological ATPase staining with pH 4.35 preincubation. At low b-values, diffusion parameters appear to correlate with metabolic characteristics of meat fibers, as highlighted by photomicrography of areas characterized by histoenzymological ATPase staining.
Variability of the dimensions of structural components: The attenuation minima observed in data acquired in homogeneous regions are shallower than those predicted by the mathematical models. The experimental plot deviates from the theoretical one mainly because of heterogeneity in fiber diameter, imperfect parallelism of the fibers, and perhaps imperfect impermeability of the fiber membranes.

Significance and benefits: When building a mimetic model that involves structural components, it is essential to access these components precisely, in situ, on real material. The tools set up in this part of the DREAM project, and the results describing structural meat components, can be used as input for creating artificial mimetic meat samples and could be applied to many other targets. We have shown that characterizing the behavior of water diffusion within a voxel provides a means of describing the inner microstructure at a cellular scale, based on apparent diffusion coefficients measured in muscle. Exploiting the anisotropy of water diffusion due to the highly fibrillar structure of meat, we modeled this diffusion in three dimensions using tensors. We used Diffusion Tensor Imaging (DTI) with different b-values to obtain high-resolution diffusion parameter maps of tissue, which were registered to high-resolution susceptibility-weighted gradient-echo images and to histological images, in order to determine whether relationships exist between meat microstructure and diffusion observed at a meso-scale.

Prospects and challenges: The results of this research are a step towards the construction of realistic food models such as meat and will make it possible to feed the models built in the other workpackages. Promising results have been obtained, showing structural details correlated with metabolic characteristics. To the best of our knowledge, this is the first time that a scatter-like behavior has been observed in a biological matrix, which seems very promising for quantifying structural information from the resulting diffusion attenuation plots. Future efforts will focus on differentiation among muscle types and meat fiber types and on the variability of the structural components. Furthermore, the developed methods could be applied to other food products.

WP4-Key result 1-The Dairy dessert: a model for designing tailored interfaces in food systems
The main objective of this work was to understand the impact of interfacial composition and organization on the connectivity between fat droplets and on the microstructure of oil-in-water emulsions. Four kinds of objects were obtained by applying different processes to protein solutions containing various protein ratios. Their different properties led to competition between these objects at the interface, generating four types of structure with different connectivity and hence various rheological properties: liquid, structured liquid or gel. In complex dairy desserts, only the presence of aggregated whey protein at the interface increased the firmness of the desserts. When emulsions and dairy desserts were enriched in PUFAs, lipid oxidation remained low after 2 months of storage at 4°C.

Research aims and background: Most processed foods contain gels and dispersions of some sort, consisting of small particles such as fat or protein dispersed in another medium. There are currently no gelled/dispersed model systems that can be consistently reproduced and made widely available for testing nutrient, allergen or toxicant release, microbiological safety, etc. The work described here addresses this issue with a standardised dairy dessert model. The production of the dairy dessert builds on experience at IFR (UK), INRA and Soredab (France) in studying and understanding protein/surfactant interactions in emulsions and foams, fat composition, and the use of biopolymers such as starch or pectin as thickening/gelling agents. We hypothesized that changes in protein structure caused by heat treatment could induce different structures of the interfacial layer (differing in thickness and homogeneity), with or without disulfide bridges. These different morphologies would in turn affect the structure and texture of the emulsion, which could moreover affect the release and bioaccessibility of nutrients contained in the fat (ω-3 polyunsaturated fatty acids, for example) or the digestibility of proteins. Consequently, the aim of this study was to evaluate the combined effect of heat treatment and of changes in the CM/WP ratio on the structure of interfaces, in relation to their impact on the texture of O/W emulsions. To this end, our strategy was to generate emulsions with a range of well-defined interfaces by combining different compositions (weight ratio of casein micelles (CM) to whey proteins (WP): 80:20 to 12:88) with heat treatments of the milk proteins (60°C or 80°C).

Results and applications: The results of the work on the dairy dessert have been published in a series of papers from INRA Nantes and INRA Grignon (Foucquier et al. 2011; Surel et al. 2013). Initial measurements determined the mean size of the casein micelles (CM) to be 140-160 nm regardless of processing temperature (60°C or 80°C), whereas the whey protein (WP) aggregates increased in size from 6 to 100 nm as the processing temperature increased. The emulsion droplet size was remarkably insensitive to either the CM/WP ratio or the temperature. The primary controlling factor of the model was the interfacial composition as a function of CM/WP ratio and processing temperature. When the ratio was below 0.2, the interface was dominated by WP, and at 0.2-0.3 there was a minimum in the amount of protein adsorbed at the interface. At CM/WP ratios above 0.3 the interface was dominated by CM, and the effect was more marked at the higher temperature. These differences in interfacial composition also had a marked effect on the interactions between droplets and thus on the rheological behaviour of the emulsions.
The overall effect is that the first group, with a liquid texture, consists of emulsions whose proteins were treated at 60°C, regardless of the CM/WP ratio, and those with a ratio of 0.8 or more treated at 80°C. The second group, producing a structured liquid with higher viscosity, includes emulsions with CM/WP ratios of 0.19-0.26 treated at 80°C. The final group, of gelled systems, were all treated at 80°C and have CM/WP ratios below 0.15 or from 0.3 to 0.5. In these last systems the emulsion droplets are all interconnected by aggregated protein, causing the system to gel. In parallel with the experimental work undertaken to develop the physical model, a mathematical model was developed to predict the texture of the final model from a number of inputs, including the amounts of casein and whey, the size of the whey aggregates, the processing temperature and the homogenisation pressure. From this information a number of internal variables are generated that provide information on the amount of adsorbed CM and WP, the amount of aggregated WP adsorbed and the emulsion droplet connectivity. From these variables the structure of the resulting emulsion system can be predicted.
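The texture grouping reported above can be transcribed as a small rule-based sketch. It covers only the CM/WP ratio and treatment temperature (the full mathematical model also uses whey aggregate size and homogenisation pressure), and it assumes the structured-liquid group corresponds to the 80°C treatment, since all 60°C emulsions fall in the liquid group.

```python
def predict_texture(cm_wp_ratio: float, temp_c: float) -> str:
    """Rule-based transcription of the three reported texture groups;
    the handling of ratios between the reported bands is an assumption."""
    if temp_c <= 60:
        return "liquid"                # all 60 C emulsions stay liquid
    if cm_wp_ratio >= 0.8:
        return "liquid"                # high-CM emulsions at 80 C
    if 0.19 <= cm_wp_ratio <= 0.26:
        return "structured liquid"     # higher-viscosity group
    if cm_wp_ratio < 0.15 or 0.3 <= cm_wp_ratio <= 0.5:
        return "gel"                   # droplets interconnected by protein
    return "unreported intermediate"
```

Such a lookup is only a summary of the observed groups, not a substitute for the predictive model built from the internal variables.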
The equipment required to produce the model is relatively basic, comprising stirrers, heaters and an Ultra Turrax to make the pre-emulsion, which is then passed through a high-pressure homogeniser capable of delivering 50 bar.

Significance and benefits: This model offers the possibility of using a standardised dessert model with a range of textural properties. The texture of the dessert can be predicted from the amounts of micellar casein and whey protein added, and from the thermal treatment and homogenisation pressure used to make the emulsion. The standardised and predictable nature of the model makes it ideal for use in risk assessment involving the growth of pathogenic or spoilage organisms or involving toxin contamination. In addition, it is suitable for studies on bioaccessibility or bioavailability using in vitro or in vivo methods respectively.

Successful applications: This model has successfully been used to determine lipid oxidation rates after the incorporation of kiwi seed oil containing high levels of ω-3 polyunsaturated fatty acids into the formulation. No statistically significant differences were seen in the levels of oxidation after 2 months of storage at 4 °C regardless of formulation or processing temperature.

Prospects and challenges: The development of mathematical models able to predict the formation of structure and thus the texture of dispersed systems represents an interesting prospect for the future. It also offers the possibility of broadening the range of dispersed food systems that can be included in the modelling. In particular, coordinated approaches to producing healthier food systems with lower salt and/or fat and/or sugar are becoming increasingly important. The development of standardised models of significant food types offers the possibility of designing healthier foods with the desired textural properties and shelf-life.

WP4-Key result 2-A pilot scale model for the reproducible production of soft cheese
A realistic cheese model (Brie cheese) manufactured with industrial technologies was built as a tool for experimental studies on cheese. Great attention was paid to the authenticity and repeatability of these small-scale models (around 1 kg). Twenty replicates per cheese model were made to determine the repeatability of about a hundred state variables and control variables, from the milk composition to the cheese qualities. Their coefficients of variation were around 1-1.5%. For each of these variables, statistical analyses were performed to characterize the dispersion of the data and the origin of the variability, in order to improve the models. The reliability of the models was confirmed in experiments on the influence of salt content on Bifidobacteria in Brie cheese.
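Repeatability per variable was summarized by the coefficient of variation over the twenty replicates. A minimal sketch, with hypothetical moisture values (the real study tracked about a hundred state and control variables):

```python
import numpy as np

def coefficient_of_variation(replicates):
    """CV (%) = 100 * sample standard deviation / mean, the statistic
    used to quantify repeatability of each state or control variable."""
    x = np.asarray(replicates, dtype=float)
    return float(100.0 * x.std(ddof=1) / x.mean())

# Hypothetical moisture contents (%) for 20 replicate cheeses;
# these numbers are illustrative, not from the report.
moisture = [52.1, 51.8, 52.4, 52.0, 51.9, 52.3, 52.2, 51.7,
            52.0, 52.1, 51.9, 52.2, 52.4, 51.8, 52.0, 52.1,
            52.3, 51.9, 52.0, 52.2]
cv = coefficient_of_variation(moisture)
```

Applying the same statistic to every variable, from milk composition through to ripened cheese quality, gives the 1-1.5% repeatability figures quoted above.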

Research aims and background: Many problems in the field of dairy research and development require the implementation of cheese making trials. These trials cover technological topics such as process modifications, yield, raw material and spoilage microorganisms, sensory studies (effect of starter or adjunct cultures), nutritional questions (reduction of salt or fat content, decrease of the proportion of saturated fatty acids in fat, and the use of probiotics and/or prebiotics) and food safety issues (survival of pathogens, presence of harmful chemicals). However, cheese making experiments are expensive and time-consuming, even at pilot scale: they require specific equipment, and the required environmental conditions are difficult to control. Therefore, several alternatives have been suggested for experimental studies on cheese.
The improvement of cheese models, and the strategy for their development proposed in our study, are useful for the cheese industry and its suppliers (e.g. of enzymes, lactic and ripening cultures, proteins, etc.). Cheese models are too often treated as black boxes in some laboratories, so their improvement will increase the reliability of results. Dairy research contributes to progress in the cheese industry by improving knowledge in dairy science. In particular, experimental results on food safety (e.g. growth and survival of pathogenic bacteria in cheese) or on cheese milk quality are very important for cheese makers and require suitable cheese models. The screening of ingredients for cheese making must also be performed using viable models. Finally, the strategy of characterising representativeness and repeatability can be applied to pilot plants in cheese factories.

Results and applications: Preparation of the cheese milk starts with the heat treatment of raw milk (88°C/1 min), to denature the whey proteins and to inactivate vegetative cells of bacteria present in the milk. This is followed by the preparation and addition of a protein concentrate to increase the casein content of the cheese milk. The next stage is to adjust the fat and recoverable protein content, which in turn controls the fat-in-dry-matter of the final cheese. The last step is the microfiltration of the skim milk (1.4 µm) and heat treatment of the cream (120°C for 1 minute) in order to remove spores and thermo-resistant bacteria.
The cheese itself is prepared as follows. The milk is prepared with starters and acidifiers at 39°C for 30 min in order to standardize the pH at renneting. The milk is then coagulated with recombinant chymosin to gel it, and the gel is cut into 1.7 × 1.5 × 1.5 cm pieces to promote syneresis. The vat containing the milk gel is then drained and the cheese placed in moulds, which are in turn drained at 32°C for 3 h and then at 18°C. After 1 day the cheese is placed in saturated brine at 12°C for 55 min and then ripened for 12 days (12°C, 96% RH) on grids before wrapping and storing. Mean values and standard deviations were measured for the composition of the cheese milk, the day-1 cheeses and the ripened cheeses. Good reproducibility was obtained for all state variables, with coefficients of variation around 1-1.5%.
The equipment needed for milk standardization is: Pilot scale microfiltration with 1.4 µm ceramic membranes, Pasteurizer, Tri-blender (or mixer) and a skimming centrifuge. Cheese manufacture requires a cheese vat, cutting blades or wires, cheese moulds and a thermostated cheese making room. Finally, cheese ripening requires wire grids and a ripening room (RH and temperature controlled).

Significance and benefits: A detailed, realistic and reproducible cheese model was built and characterized. The model is currently used in our laboratory and could be used as a basis for model development in research institutions or industrial laboratories. A simplified version of the model has been developed in collaboration with Soredab.

Successful applications: This model has been successfully applied to study the influence of salt-in-moisture on the growth and survival of Bifidobacterium lactis BB12. The experimental data obtained from the model were in agreement with previous studies showing an inhibitory effect of increased salt-in-moisture on butyric acid fermentation. Our results obtained in Brie cheese also confirm previous findings showing little influence of salt content on the survival of B. lactis BB12 in cheese.

Prospects and challenges: This model offers a way of producing pilot scale cheese with low variability. One of the main limitations of the model is its complexity. Therefore, one task for further development is the simplification of the model so that it may be used more widely in the cheese making industry.

WP4-key result 3- Coupled impact of product process and composition in lipids as emulsifiers on the structure and the texture of a cream cheese model in relationship with product oxidation
A cream cheese is a dairy emulsion in which proteins act both as emulsifiers at the fat globule interface and as protein network components. The objective of this work was to understand the impact of the process and of the addition of lipid emulsifiers on the interface composition and hence on the model texture. The process step most discriminating for the product's rheological properties is homogenization; nevertheless, its influence is modulated by the heat treatment. The addition of lipid emulsifiers decreases connectivity and hence the product's firmness: these molecules compete with proteins and replace them at the interface. The cream cheese model is not sensitive to oxidation, whether after intense thermo-mechanical treatment, storage, or enrichment in polyunsaturated fatty acids.

Research aims and background: The aim was to develop a cream cheese model that is easy to produce, with a well-characterized composition, in order to understand the impact of the process and of the lipid and protein emulsifying agents on its structure, texture and oxidative stability.
Some previous work exists on the impact of the process on product texture (Sanchez et al., 1996). However, there is a lack of information regarding the influence of each process step on the final product.
Moreover, although the emulsifying capacity of phospholipids is well known, the influence of their addition on the fat globule interface, product structure and texture remains little explored.

Results and applications: Regarding the impact of the process on the structure and texture of the cream cheese, the project showed that the final homogenization pressure has a large impact on the texture of the cream cheese model. When the pressure increases, the fat globule size decreases. This increases the specific surface of the fat and the number of interactions between particles, and consequently the firmness of the cream cheese model.
The pH at the end of acidification has a very limited impact on rheological characteristics of the model, but it has a significant one on its sensory properties: products acidified to pH 5.2 are considered more spreadable and brighter than products acidified to pH 4.9 (Coutouly et al., 2013).
Regarding the enrichment of the cream cheese with lipid emulsifiers, the project showed that phospholipids have a greater affinity for the fat globule interface than proteins. Like other low-molecular-weight surfactants (Mackie et al., 1999), they replace proteins at the interface. Consequently, the number of interactions between particles decreases, and so does the firmness of the cream cheese model. The presence of phospholipids results in partial coalescence of fat droplets, which yields softer and smoother final products.
Finally, regarding stability, the cream cheese model, whether or not enriched with polyunsaturated fatty acids, and whatever the manufacturing process, proved very stable with respect to lipid oxidation during 3 months of storage at 4°C.
This project allowed a better understanding of how the interactions between particles and the microstructure determine the final cream cheese texture.

Significance and benefits: The project was performed with cream cheese models analogous to actual products. The manufacture of the cream cheese model integrated a process and ingredients commonly used in dairy industries. The different parameters studied here (process parameters such as heat treatment temperature, final acidification pH and homogenization pressure; ingredients such as proteins or phospholipids) can be tested straightforwardly in dairy industries, so the project outcomes can provide direct applications and solutions for industry. A thorough characterization of the product was achieved at each stage of the process and at different scales: the consequence of each variation of the process or the formula was characterized from the nanoscale (electron microscopy) to the macroscale (rheology and sensory analysis). The project provided an overview of the parameters, both technological and compositional, that can be used to modulate cream cheese texture.

Successful applications: The cream cheese model could be used in other research projects. The direct results link process, formula, and product structure and texture. Moreover, this work produced a large body of data that is currently used for mathematical modeling. Two studies are in progress: modeling protein aggregate formation in a tubular heat exchanger during heat treatment, and modeling the colonization of the fat globule interface upon homogenization and the induced connectivity.

Prospects and challenges: This work opens various perspectives. Firstly, the analysis of the fat globule interface needs to be more specific; to reach this goal, a method for characterizing the fat globule interface in the final product should be developed.
Moreover, further work should use a source of purified phospholipids or other emulsifiers; in this work, buttermilk isolates were used as the source of polar lipids. Finally, it will be necessary to quantify the connectivity and link it in practice to the rheological properties and texture of the product.

WP5-Key result 1- Structural challenges in nutritionally improved biscuits
Biscuits are an important cereal food category. They are most often made of white flour, fat and sugar, and consequently typically have high calorie but low nutrient density. We wanted to understand the effects of adding dietary fibre (whole grain flour and wheat bran) on biscuit microstructure, texture and in vitro starch digestibility. Fine and coarse wheat bran were used to make five different biscuits with various dietary fibre contents (5-15%). The biscuits with small bran particle size had the best sensory texture. The generally low in vitro starch digestibility of the biscuits was increased slightly by wheat bran addition.

Research aims and background: There is a large consumer and public health demand to improve the nutritional profile and expected health benefit of cereal-based snacks. Biscuits, with white flour, fat and sugar as major ingredients, are popular snacks and breakfast items. The production steps during biscuit manufacturing do not allow starch to gelatinize completely despite the baking process, and the glycaemic response of biscuits is typically lower than that of bread. On the other hand, the nutritional profile of biscuits is not optimal. Making biscuits of whole grain flour or adding dietary fibre to the recipe would improve their nutritional profile, but poses challenges to the open solid foam structure that determines the mechanical properties and sensory quality of biscuits. We wanted to elucidate the role of dietary fibre, in the form of cereal bran or whole grain flour, on biscuit structure and on the in vitro starch digestibility of biscuits. The aim was to be able to adjust the process conditions so as to maintain good sensory texture and low starch digestibility while increasing the amount of cereal fibre and associated phytochemicals in the biscuit formulation. The results are important for the baking industry and biscuit manufacturers in the development of more nutritious products. The knowledge of interactions between dough components and bran particles also benefits producers of other baked snacks and dry products.

Results and applications: Five types of biscuit were produced containing 5-15% dietary fibre, including a standard recipe and biscuits made with coarse (260-560 μm) and fine (25-160 μm) wheat bran. Textural measurements were made with a three-point bending test. Further measurements were made by a penetrometry method for 40 types of biscuit with a wider range of fibre variations, including bran and soluble inulin fibre. Starch crystallinity was analysed by differential scanning calorimetry, and the in vitro starch digestibility rate was determined as a hydrolysis index by an enzymatic method. Biscuits became darker with increasing bran content. Bran particle size had little effect on average colour, but the particles were visible as specks. The mechanical properties of the bran-containing biscuit matrix were affected by both bran content and particle size. Addition of wheat bran increased the penetration energy ('firmness'), only slightly increased the elastic modulus, and reduced the number of penetrometry force peaks ('crunch'). Inulin also increased firmness but, unlike bran, resulted in a high initial penetration force. Bran particle size reduction had little effect on firmness, but increased the elastic modulus and hardness measured by three-point bending. The bran supplementation level had a greater effect in penetrometry tests, whereas particle size had a greater effect in three-point bending. The failure strain at the maximum bran loading level was lowest among the coarse-bran-supplemented biscuits; these samples had very low strain at failure (1.1-1.4%), which is typical of brittle materials. Structural factors had more impact on the in vitro starch digestibility rate than the status of the starch, measured as degree of gelatinization. Biscuits with finely ground bran had a visually more compact structure without surface or internal defects and were harder than those with a coarse bran particle size.
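For readers unfamiliar with the three-point bending test, the elastic modulus is conventionally obtained from the standard beam formula E_f = L³m / (4bd³); the sketch below uses hypothetical specimen dimensions and slope, since the report does not give them.

```python
def flexural_modulus(span_mm: float, width_mm: float,
                     depth_mm: float, slope_n_per_mm: float) -> float:
    """Flexural (elastic) modulus from a three-point bending test.

    span_mm: support span L; width_mm, depth_mm: specimen cross-section
    b and d; slope_n_per_mm: initial slope m of the force-deflection
    curve. Returns the modulus in MPa when inputs are in mm and N/mm.
    """
    return span_mm ** 3 * slope_n_per_mm / (4.0 * width_mm * depth_mm ** 3)

# Hypothetical biscuit bar: 40 mm span, 20 mm wide, 5 mm thick,
# initial force-deflection slope of 10 N/mm.
e_mod = flexural_modulus(40.0, 20.0, 5.0, 10.0)
```

The same force-deflection curve also yields the failure strain quoted above, from the deflection at fracture.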
Increasing the fibre content from 5 to 15% increased the hydrolysis index by 20%, whereas bran particle size reduction did not influence it.

Significance and benefits: It is a food engineering challenge to increase the dietary fibre content of biscuits while retaining their structural and sensory characteristics, including texture, colour and taste. Very few reports are available on adding dietary fibre to biscuits and its effects on these characteristics or on starch digestibility. The use of whole grain flour or bran seems a logical way of increasing the fibre content of biscuits, but no reports of their use are available. Fundamental understanding of structure-function relationships in biscuit structure and fracture properties is also limited. Current milling techniques enable efficient particle size reduction of bran, delivering new types of ingredients for the baking industry whose applications have hitherto been reported only to a limited extent. The approach of this work was thus novel in many respects.
The results, when applied, would encourage the use of finely milled bran and whole grain flour as biscuit raw materials, to produce biscuits with good sensory texture and colour while retaining the naturally low starch digestibility of biscuits. The benefit for biscuit manufacturers would be the production of new, healthier products, providing consumers with healthier options. The use of bran in food production would also increase sustainability, as a side stream would be utilized in a mainstream food product.

Prospects and challenges: We hope that the results will encourage biscuit manufacturers to develop high-fibre products, thereby diluting the amount of refined ingredients in the product. The nutrition and health claim regulation of the European Food Safety Authority would allow a fibre content claim for products with an adequate nutritional profile. The idea of bran pre-treatment prior to incorporation in baking could also be developed further in collaboration with the milling and baking industries.

WP5-Key result 2- Creation of bread cellular structure
The growth of gas bubbles in viscous matrices is the main mechanism responsible for the structure of open solid foams such as bread. The creation of the cellular structure in wheat flour dough, and its heterogeneity, was ascertained at different structural scales, focusing on proofing. The dough liquid phase, including sugar and fat, was studied as a stabilizer of the air bubbles in the dough and bread matrix. The contribution of each scale was integrated by adapting a capillary number. The model was used and further specified in a ring test on fibre-enriched bread, and protocols for the bread model were validated and disseminated to end users.

Research aims and background: The texture of bread is a fundamental element of its acceptability to consumers, and it has a strong impact on its nutritional properties. For instance, increasing the fibre content leads to denser, less acceptable breads. As for any solid foam, the mechanical properties and texture depend on the density and cellular structure of the bread. There is therefore a need to better understand the mechanisms of cellular structure creation during dough processing and to provide models to better control the process and final properties.
Wheat flour dough contains about 45% water (total basis) and its cellular structure is created during fermentation, or proofing; during this stage, porosity increases from 0.1 to 0.7 and CO2 bubbles become connected, yet the dough does not collapse. This stability could be due to the formation of a liquid foam, constituted by a liquid phase co-continuous with the starch/gluten visco-elastic matrix. To determine the role of this phase in the creation of the cellular structure, the aqueous phase, the so-called dough liquor (LdP), was considered a good model for these interfacial films. In addition, the elongational properties of dough can be determined by the lubricated squeezing flow test (LSF), which underlines the importance of minor components. Starting from the study of bubble growth and coalescence in model dough systems, the aim of our work was to determine the role of the aqueous phase and the starch/gluten matrix in the mechanisms that govern the creation of the cellular structure at the microscopic scale and, at the macroscopic scale, the loss of stability at the end of fermentation.
The results are important for the baking industry, first for manufacturing breads with increased fibres content, without loss of sensory quality. Secondly, it strengthens the knowledge on the breadmaking chain by providing engineers with basic knowledge models.

Results and applications: About twelve dough recipes were processed with varying contents of sugar (0-15%), fat (0-10%) and fibres (0-15%), within a range for which a typical bread cellular structure was always obtained. The elongational properties of the starch/gluten matrix, measured by lubricated squeezing flow, largely influenced dough proofing stability, whose evolution was assessed by 2D image follow-up and fitted by an exponential decay; this result could not be explained by the single bubble growth model. Porosity kinetics determined at the macroscopic level were in good agreement with results determined at the microscopic level by X-ray microtomography (XRT) at the ESRF (F38, Grenoble), and both followed a Gompertz model. Analysis of 3D-XRT images showed that most bubbles were connected at the highest dough porosities (≥0.5). The homogeneity of the cellular structure was defined from the size distributions of gas cells and walls; it was characterized by a critical wall thickness (≈1 μm) below which the cells were separated by liquid films. The fermented dough could thus be considered a three-phase medium: viscoelastic matrix / gas cells / liquid phase. Dough liquor was taken as a model of this liquid phase and extracted from the dough; it behaved like a macromolecular solution and was characterized by its surface tension (≈40 mN/m), related to the presence of polysaccharide-protein complexes at interfaces. The contributions of the different levels of organization of the dough were then integrated by defining a (dimensionless) capillary number that ruled the overall behaviour of the dough.
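A Gompertz fit of porosity kinetics of the kind described above can be sketched as follows. The parameterisation, the synthetic data and the initial guesses are illustrative assumptions, not the project's fitted values; only the 0.1-0.7 porosity range is taken from the text:

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, p0, pmax, k, tl):
    """Sigmoidal Gompertz curve for dough porosity during proofing:
    starts near p0, rises with rate parameter k, inflects near time tl,
    and plateaus at pmax."""
    return p0 + (pmax - p0) * np.exp(-np.exp(-k * (t - tl)))

# Synthetic porosity data mimicking the reported range (0.1 -> 0.7)
t = np.linspace(0, 120, 25)                       # proofing time, min
rng = np.random.default_rng(0)
data = gompertz(t, 0.1, 0.7, 0.08, 40.0) + rng.normal(0, 0.01, t.size)

# Nonlinear least-squares fit; recovers parameters close to the true ones
popt, _ = curve_fit(gompertz, t, data, p0=[0.1, 0.7, 0.05, 30.0])
```

The same fitted parameters can then be compared between macroscopic (volume-based) and microscopic (XRT-based) porosity measurements, as was done in the project.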
These results, including process specifications, were used in a ring test, carried out in different laboratories, to study the effect of fibre addition. Although various texture properties were obtained, all could be integrated into the same relation between texture and density, which finally validated the open solid foam model.

Significance and benefits: Improving the nutritional properties of bread without degrading its sensory properties, mainly its texture, is a real challenge for the baking industry. A better understanding of the various operations is necessary to control the density and cellular structure of these products, which can, in turn, be related to texture by available mechanical models. For this purpose, a common representation of these operations can be obtained by defining Basic Knowledge Models (BKMs), which capture the main physics of the phenomena involved. For instance, the relation between the capillary number and stability can be considered a BKM; it suggests that the simple measurement of dough elongational viscosity, together with knowledge of the dough liquor surface tension, can lead to the prediction of the cellular structure of the dough.
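As an illustration of such a BKM, a capillary number compares the viscous stress deforming a bubble with the capillary pressure stabilising it. The form and the numerical values below are assumptions for illustration (only the ~40 mN/m dough liquor surface tension comes from the text); the exact definition adapted in DREAM may differ:

```python
def capillary_number(eta_e, strain_rate, radius, surface_tension):
    """Dimensionless capillary number: viscous stress (eta_e * strain_rate)
    over capillary pressure (surface_tension / radius). Ca >> 1 means
    viscous deformation dominates; Ca << 1 means interfacial tension
    keeps bubbles stable."""
    return eta_e * strain_rate * radius / surface_tension

# Illustrative dough values (assumptions): elongational viscosity 1e5 Pa.s,
# strain rate 1e-3 1/s during proofing, bubble radius 100 um,
# dough liquor surface tension ~40 mN/m (reported above)
ca = capillary_number(1e5, 1e-3, 100e-6, 0.040)   # -> 0.25
```

Two routine measurements, elongational viscosity (LSF test) and dough liquor surface tension, are thus enough to evaluate this number for a given recipe.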

Prospects and challenges: The integration of such BKMs requires upgraded computer tools. Their use for designing cereal food products requires (a) integrating the available know-how and expertise for specific process operations where models based on differential equations are still difficult to apply, and (b) extending the existing models to a wider composition domain, in order to account for the necessary increase in dietary fibre, which greatly modifies the rheological behaviour of dough. Such integrated models could then be used to design products with target nutritional and sensory properties, provided their porosity and cellular structure are precisely characterized, and to define the pathways for reaching them, according to the so-called reverse engineering approach.

WP6-Key result 1- Acting on micro- and macrostructure of meat, cooking conditions can modify the nutritional potential of meat proteins
In addition to amino acid composition and digestibility, new criteria are emerging to fully describe the nutritional potential of proteins: the kinetics of amino acid absorption, and the potential to release bioactive peptides during digestion. For meat, these parameters can be modified by cooking conditions, which act on the micro- and macrostructure of the product. Drastic cooking conditions lead to protein aggregation, which slows the enzymatic digestion of proteins. However, this effect, related to the microstructure, is limited in comparison with the effect of the macrostructure and of the chewing efficiency of the consumer. Furthermore, a simple difference in cooking conditions significantly modified the postprandial plasma peptide profile of the consumer, and therefore the potential health effect of meat.

Research aim and background: The classic criteria for evaluating the quality of a protein source are based on amino acid composition and protein digestibility in the digestive tract. It is now known that these basic criteria are not sufficient to fully describe the nutritional potential of a protein. For instance, it has been shown that the rate of protein digestion can regulate postprandial protein retention (Dangin et al., 2003). Thus, ranking protein sources according to their rate of digestion is necessary. Additionally, total digestibility is not a good predictor of amino acid bioavailability. Indeed, only digestion in the small intestine is thought to supply amino acids to the body. Finally, all dietary proteins are potential sources of peptides with beneficial health effects.
Previous work on meat shows that modifications at the microscopic scale of the protein cellular network, for instance by protein aggregation during heat treatments, can slow down the enzymatic digestion of proteins (Bax et al., 2013). The rate of meat protein digestion can also be modulated by chewing efficiency (Rémond et al., 2007), probably in conjunction with meat macrostructure. However, the hierarchy between the effects of meat micro- and macrostructure and of mastication on digestion parameters remains difficult to establish.
Furthermore, it has been shown that meat digestion reproducibly releases peptides containing amino acid sequences with antihypertensive activity (Bauchart et al., 2007a), and that a significant amount of carnosine is released in blood after a meat meal (Bauchart et al., 2007b). This dipeptide has numerous health benefits such as prevention of pathologies related to oxidative damage, or protein glycosylation. Nothing was known on the effect of the meat structure (micro and macro) on the bioavailability of meat derived peptides.
In this context, we used in vitro and in vivo approaches (with minipigs and rats as animal models) to investigate the effect of structural modifications, obtained by manipulating meat cooking conditions, on the bioavailability (kinetics and quantity) of amino acids and peptides.

Results and applications: We have shown that cooking pork meat for 45 min at 90°C accelerates protein digestion, compared with 10 min of cooking at 75°C, but has no effect on the total bioavailability of amino acids. Although these results underline the importance of the structure of the ingested meat in determining amino acid absorption kinetics, they seem to disagree with a previous study on beef, in which meat protein digestion was slowed down by increasing the cooking temperature from 75°C (30 min) to 95°C (30 min) (Bax et al., 2012). Besides a possible effect of meat origin, this suggests that, more than the temperature, the time-temperature couple determines the protein digestion rate. Furthermore, in the beef study, meat was minced before serving, whereas in our study, meat was only sliced. An interaction between cooking conditions and the efficiency of oral and gastric meat degradation could therefore also explain the apparent discrepancy between the two studies. Indeed, in the present study, prolonged cooking at 90°C produced a meat whose structure was much more sensitive to mechanical degradation in the mouth, leading to an increased digestion rate.
Cooking conditions did not modify meat carnosine content, and in both animal models (rats and minipigs), the carnosine bioavailability was not affected by cooking.
In order to address the question of peptide bioavailability, we developed an analytical approach, using an LTQ-Orbitrap Velos mass spectrometer and data extraction with XCMS, for the characterization of the plasma peptidome and the identification of selected peptides. This method enabled us to discriminate the plasma peptidome of the minipigs fed the pork meat cooked under the two conditions. We identified up to 33 peptides discriminating the cooking conditions. The identification of these peptides was validated by analysis of their fragmentation. Furthermore, the kinetic analysis allowed us to show the postprandial trajectory of the plasma peptidome.

Significance and benefits: Our work highlighted the difficulty of predicting protein digestion rate. Although it is clear that protein aggregation reduces the accessibility of digestive enzymes to their cleavage sites within the proteins, this parameter appears to have a minor effect when meat is not ground. In that case, the resistance of the meat structure to disruption by chewing could be the main determinant of digestion speed. As this depends on the chewing efficiency of the consumer, it seems difficult to predict from meat-derived measurements alone (chemical composition, structure characterization, and in vitro digestion).
We have clearly evidenced that for a given meat, a simple difference in cooking conditions significantly modifies the postprandial plasma peptidome of the consumer, and therefore the potential health benefit of meat proteins.

Prospects and challenges: This work is being continued through the study, in humans, of the interaction between meat structure (cohesiveness and tenderness) and chewing activity, in order to assess the final impact on the degree of degradation of the swallowed bolus and the consequences for protein digestion rate and peptide release. If we identify peptides that are reproducibly released from meat proteins, the next challenge will be to identify their potential biological activity (antihypertensive, immunomodulatory, antioxidant, etc.).

WP6-Key result 2-Industrial concerns and needs towards fungal risk assessment
Alternaria species are reported to be the fungi most commonly affecting tomato fruit and plants, causing the so-called black mould of tomato. Rapid Alternaria infection of tomato may occur on the crop or post-harvest, causing high economic losses due to spoilage of industrialized products such as tomato purée. Moreover, under specific growth conditions, Alternaria spp. may also produce various mycotoxins, which represent a serious risk for human consumption of tomato-based products. Within the frame of this collaboration, the ability of the fungus to grow and produce mycotoxins was determined as a function of tomato purée pH and storage temperature. These boundaries and growth simulations will help the food industry to further optimize tomato-based food formulation and shelf-life.

Research aims and background: Predictive modelling and microbial risk assessment have emerged as a comprehensive and systematic approach for addressing the risk of microbial pathogens and spoilers in specific foods and processes. Within WP6.4, food shelf-life and the impact of physico-chemical factors (pH, aw, temperature) on microbial behaviour were determined by (i) artificial inoculation of the microorganism of interest into the developed model food, (ii) fitting of experimental kinetics in the food and (iii) prediction of shelf-life for various scenarios under static or dynamic storage conditions. Challenge tests in DREAM model foods were performed according to guidelines and standardized protocols based on the current NF V01-009 guidelines for conducting microbial challenge tests. Experimental fitting and shelf-life determination were performed with recognized mathematical models available in the Sym’Previus decision-making tool. While simulation with a large variety of characterized strains of pathogenic bacteria is possible, few spoilage fungal strains are available in the database.
Similar to the modelling approach reported by Huchet et al. (2013) for predicting mould appearance time on pastries, an adaptation of the gamma concept of Zwietering (1992) was used to characterize the fungal strain and to model its growth. In tomato-based products, Alternaria alternata represents a relevant microbial hazard, since it is responsible for black mould spoilage causing high economic losses (Bottalico & Logrieco, 1992). In order to model the effect of environmental factors on mould development on tomato medium, a deep characterization of the strain was performed over a wide range of temperature and pH values at a fixed water activity of 0.99. Moreover, to further screen the conditions leading to mycotoxin production, a similar experimental set-up was used and analysed.
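The gamma concept combines a strain's optimal growth rate with independent dimensionless factors, one per environmental variable, each equal to 1 at the optimum and 0 at the cardinal limits. The sketch below uses Rosso-type cardinal models for temperature and pH; the cardinal values are loose illustrations anchored only to the optimum reported later (24.5°C, pH 5.5), not the project's actual estimates:

```python
def gamma_temp(t, tmin, topt, tmax):
    """Rosso cardinal temperature model (CTMI): 0 at tmin/tmax, 1 at topt."""
    if t <= tmin or t >= tmax:
        return 0.0
    num = (t - tmax) * (t - tmin) ** 2
    den = (topt - tmin) * ((topt - tmin) * (t - topt)
                           - (topt - tmax) * (topt + tmin - 2 * t))
    return num / den

def gamma_ph(ph, phmin, phopt, phmax):
    """Cardinal pH model: 0 at phmin/phmax, 1 at phopt."""
    if ph <= phmin or ph >= phmax:
        return 0.0
    return ((ph - phmin) * (ph - phmax)
            / ((ph - phmin) * (ph - phmax) - (ph - phopt) ** 2))

def growth_rate(mu_opt, t, ph, cardinals):
    """Gamma-concept growth rate: mu_opt scaled by independent
    gamma factors for temperature and pH (Zwietering, 1992)."""
    (tmin, topt, tmax), (phmin, phopt, phmax) = cardinals
    return mu_opt * gamma_temp(t, tmin, topt, tmax) * gamma_ph(ph, phmin, phopt, phmax)

# Illustrative cardinal values (assumptions): growth between 5-37 C and pH 2.5-8
cards = ((5.0, 24.5, 37.0), (2.5, 5.5, 8.0))
mu = growth_rate(0.5, 24.5, 5.5, cards)   # equals mu_opt at the optimum
```

Once the cardinal values of a strain have been estimated in broth, the same function predicts its growth rate for any storage temperature and product pH, which is the basis of the boundary maps described below.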

Results and applications: A toxigenic isolate of Alternaria alternata (ITEM8176), isolated from tomato fruit affected by black mould (Somma et al., 2011) and deposited in the ISPA collection, Italy, was used for growth and mycotoxin production assessment.
Growth ability and mycotoxin production of this strain were determined on similar samples, after inoculating fungal ascospores (7-day-old culture) onto WP2 cold-break tomato purée supplemented with agar. Adequate controls and a complete factorial design with a total of 6 levels of pH (2, 3, 4, 5, 6 and 7) and 10 levels of temperature (6.5, 10, 15, 20, 25, 30, 35 and 37°C) were used (60 conditions in total), with 3 replicates, to define which combinations of pH and temperature supported (i) fungal development and (ii) mycotoxin production. Growth ability was determined on the tomato-based medium by regular observation of fungal development, and additional experiments in broth were performed to determine the strain's cardinal values, i.e. the minimal, optimal and maximal values of pH and temperature allowing growth. When plates were covered by the fungus, mycotoxin quantification (tenuazonic acid, TeA; alternariol, AOH; and alternariol methyl ether, AME) was performed by HPLC coupled with UV/DAD detection according to a protocol adapted from Solfrizzo et al. (2004).
After a maximum incubation time of one month, growth was observed in 35 conditions. pH 3 was the lowest pH value enabling growth. The lowest and highest temperatures allowing growth were further validated with experimental data. Among the investigated mycotoxins, only TeA was produced by Alternaria alternata ITEM8176 under the studied conditions.
Interestingly, the growth optimum (24.5°C, pH 5.5) and the optimum for mycotoxin production (15°C, pH 3.5) occurred under contrasting conditions, suggesting a strong impact of stress conditions on the strain's virulence, which may represent a health issue under non-adapted storage conditions.

Significance and benefits: Even though further characterization is needed to predict mycotoxin production under other physico-chemical conditions, the growth of Alternaria black mould spoilage on tomato-based food products can now be predicted for static and dynamic storage conditions. These results indicate the combinations of pH and temperature allowing Alternaria mould development and mycotoxin production in tomato-based products. Knowing these boundaries will help industry to identify and control microbial hazards, targeting either food spoilage (foods with neutral pH) or mycotoxin synthesis (acid foods or juices), and to further optimize tomato product formulation and storage conditions in order to limit mould and mycotoxin development during shelf-life.

Successful applications: To our knowledge, this is the first time that the boundaries of the physico-chemical conditions permitting growth and mycotoxin production of Alternaria black mould spoilage have been defined on tomato-based products. These results were obtained thanks to a fruitful collaboration combining complementary expertise and know-how. These new data are currently being disseminated to public, scientific and food-industry audiences for future applications.

Prospects and challenges: Alternaria alternata is the major microbial contaminant causing tomato black mould spoilage, responsible for high economic losses. Thanks to these results, the prediction of spoilage development is now possible for static or dynamic pH and temperature conditions on tomato medium. Future work could address (i) validation for various recipes of tomato-based products, (ii) characterization of a larger number of strains to take biodiversity into account in growth prediction, or (iii) further characterization of mycotoxin synthesis to predict health issues under various industrially relevant conditions.

WP7-key result: Industry guide for Food Modelling
This guideline can be used as a manual in which the potential user can find advice on questions related to the use of specific models, as well as general considerations on the application and design of food models.
The content of the guideline, including the descriptions of the models, is based mainly on the models developed within the DREAM FP7 project, together with some models from outside the DREAM project that are currently available and frequently used in the food industry. Brief descriptions of some general examples of successful practice, and hints for avoiding typical traps and failures, are summarized in the guideline.

Research aims and background: To support the practical application of realistic food models a Practical Guideline on use of models was developed. The main objective of the guideline is to provide an overview on different food models and on modelling tools/software for the potential users.
Models can be used in many different activities in the food sector, given the complexity of foods and their diverse applications. As needs and requirements related to food products change increasingly quickly and frequently, dissemination of available and effective models to the food industry, and to all sectors dealing with food, is of high importance.
The target audience of this guideline includes several stakeholders of the food sector, particularly representatives of the food industry (including SMEs) and R&D teams, and decision makers on food safety, quality and nutritional questions. It is also recommended for food safety and regulatory bodies, nutritionists, food scientists and for marketing specialists who are particularly responsible for the industrial development of food companies. Food models are useful tools for product and process development, for the assessment of the safety of product/process design, and can help in understanding the impact of process parameters on final characteristics of the food and yield. However, their use requires an appropriate level of expertise, competence, skills and clear practical guidance.

Results and applications: Although there is a wide range of models having different scope, the model development process is typically divided into five phases:
i) Defining the goal of the model: developing a statement of purpose
ii) Designing and developing the model
iii) Practical testing and verification
iv) Making the model available for the audience
v) Maintenance of the model
This guideline describes these five phases as a systematic procedure and provides a brief description of the steps that are essential to consider during model development.
The most important facilities and requirements for application and operation of the models are discussed to help their use and to raise awareness among potential users. The model descriptions are grouped by four major generic structure groups representing vegetable, dairy, meat and cereal products. Furthermore, there is an additional group for models with general applicability, in which these models are discussed according to their function, such as predictive microbial models and heat treatment models.

Significance and benefits: One of the main advantages of the realistic food models is to mimic the behaviour of real food products. Furthermore, models can predict the impact of changes to the ingredients, compositions and process parameters. Thus, they can reduce the number of necessary experiments in real conditions, which is particularly important in the case of experiments in factory environments. The use of models can save time and can reduce costs. Standardised physical modelling materials and calculations with mathematical models provide a more reproducible benchmark for the impact of different treatments on food properties than experiments with real foods.
Because of the rapidly changing conditions and demands of the market, food producers need a development process that is both safe and fast. As experiments under real conditions can be expensive and time-consuming, there are frequently significant limitations on carrying out a large number of experiments under such conditions. In these cases, food models and software models can be good tools to screen options at low cost and to let experiments with real foods focus on the most promising test parameters.
By using models, waste of the valuable real food product can be significantly reduced during the experiments.

Prospects and challenges: Due to the high competition in the food market, effective and quick product and process development procedures should be applied by food businesses. New product development can be the right solution either for improving quality or reducing cost. The product development process should be based on up to date knowledge of the consumers’ needs and expectations.
However, finding satisfactory answers is a time consuming process. Models as time-saving and cost-effective tools provide fairly good support when a company decides to develop a new product: food models can help to reduce the time needed to provide an initial protocol for a production process, and mathematical models can support the simulation of different processes and the changes of the parameters.

Potential Impact:
DREAM had several significant impacts:
- On the different stakeholders:
• On the European food industry competitiveness and sustainability.
• On consumers who are increasingly seeking tasteful foods which are at the same time safe and healthy, prepared in a sustainable and ethical manner and whose prices are affordable.
• On the society which will benefit from better selection of raw materials, thereby limiting losses due to wastage leading to decreases in cost price and impact on the environment.
• On regulatory bodies harmonising their activities with organisations already active in the area.

- On new knowledge for food nutrition and quality: Within this project a novel mathematical modelling approach was developed to describe the relationship between initial food properties and processing conditions on the one hand and the final food properties on the other hand.
- On standardised food models: The project initially delivered a range of prototype model foods covering four major food categories.
- On availability of the models: All of the models developed by DREAM have been made available to all stakeholders on a free basis.
- On science at large: Information and best practices were delivered to partner organisations and the wider research community.

The major dissemination event of the DREAM project was the DREAM international conference “From Model Food to Food Model”, held 24-26 June 2013 in Nantes, France (see the links to the round-table discussions: and interviews: ). Some of the lectures given at this conference have been published in a special issue of Innovative Food Science and Emerging Technologies (IFSET). The DREAM book of results was released in December 2013, disseminated to relevant stakeholders and end-users, and is available at this link:
Since the beginning of the project, 42 peer-reviewed papers have been published, 25 have been submitted and 10 manuscripts are in preparation. 115 oral communications and 96 posters have been presented, and 31 newsletters have presented the project.
10 PhD students have been involved in the DREAM project: 5 PhD students at INRA on WP1, WP2, WP3, WP4, WP5, 1 PhD student between INRA and SOREDAB on WP4, 1 PhD student at WUR on WP2/WP1, 3 PhD students at CNR-ISPA on WP6.
Training for national industry was also organised. CCHU organised an industry working party (3 sessions) in Budapest, Hungary (29/11/2012, 27/03/2013, 06/05/2013) for representatives of the Hungarian food industry. CBRI organised presentations on DREAM in Chipping Campden, UK, to five Campden BRI technical panels (22/01/2013, 23/01/2013, 29/01/2013, 30/01/2013, 14/02/2013), in which members of Campden BRI and, more generally, people in technical roles in food companies participated. ACTILAIT organised 2 workshops in Rennes, France (23/01/2013, 17/03/2013) for experts from ACTILAIT and for dairy technology centres, presenting the cheese models. An international demonstration training was held in Budapest, Hungary (26/02/2013) as a pre-conference workshop of the 4th MoniQA International Conference; the participants came mainly from the MoniQA International Conference and the scientific community. An international train-the-trainer workshop was held in Brussels, Belgium (03/09/2013) for representatives of the National Technology Platforms (NTPs) of the ETP Food for Life.

List of Websites:
The project public website address is the following:

DREAM international conference:
- Round tables discussions :
- Interviews:

Relevant contact details
DREAM Coordinator:
Dr. Monique Axelos
Institut National de la Recherche
Agronomique (INRA)
Rue de la Géraudière, BP 71627
44316 Nantes Cedex 03, France
Tel: +33 (0) 2 40 67 51 45
Fax: +33 (0) 2 40 67 50 06

DREAM Project Manager:
Caroline Sautot
INRA Transfert (IT)
Bâtiment Chézine
Rue de la Géraudière, BP 71627
44316 Nantes Cedex 3, France
Tel: + 33 (0) 2 40 67 51 09
Fax: + 33 (0) 2 40 67 51 29