Search results for: Luca Marangoni
43 Decision Making Approach through Generalized Fuzzy Entropy Measure
Authors: H. D. Arora, Anjali Dhiman
Abstract:
Uncertainty is found everywhere, and its understanding is central to decision making. Uncertainty emerges when one has less information than the total information required to describe a system and its environment. Uncertainty and information are so closely associated that the information provided by an experiment, for example, is equal to the amount of uncertainty removed. It may be pertinent to point out that uncertainty manifests itself in several forms, and various kinds of uncertainty may arise from random fluctuations, incomplete information, imprecise perception, vagueness, etc. For instance, one encounters uncertainty due to vagueness in communication through natural language. Uncertainty in this sense is represented by fuzziness resulting from imprecision in the meaning of a concept expressed by linguistic terms. The fuzzy set concept provides an appropriate mathematical framework for dealing with such vagueness. Both information theory, proposed by Shannon (1948), and fuzzy set theory, given by Zadeh (1965), play an important role in human intelligence and in various practical problems such as image segmentation and medical diagnosis. Numerous approaches and theories dealing with inaccuracy and uncertainty have been proposed by different researchers. In the present communication, we generalize the fuzzy entropy proposed by De Luca and Termini (1972), corresponding to Shannon entropy (1948). Further, some basic properties of the proposed measure are examined. We also apply the proposed measure to a real-life decision-making problem.
Keywords: entropy, fuzzy sets, fuzzy entropy, generalized fuzzy entropy, decision making
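The De Luca and Termini (1972) measure that the abstract generalizes can be sketched in a few lines; the membership values and the normalization constant k below are illustrative assumptions, not the paper's generalized measure:

```python
import math

def fuzzy_entropy(memberships, k=1.0):
    """De Luca-Termini fuzzy entropy:
    H(A) = -k * sum(mu*ln(mu) + (1 - mu)*ln(1 - mu)).

    Crisp sets (mu in {0, 1}) give zero entropy, while mu = 0.5
    maximizes each term, reflecting maximal vagueness.
    """
    h = 0.0
    for mu in memberships:
        for p in (mu, 1.0 - mu):
            if p > 0.0:  # 0 * ln(0) is taken as 0 by convention
                h -= p * math.log(p)
    return k * h

# A crisp set carries no fuzziness; a half-membership set is maximally fuzzy.
print(fuzzy_entropy([0.0, 1.0]))  # 0.0
print(fuzzy_entropy([0.5]))       # ln(2) ≈ 0.693
```

The maximality at mu = 0.5 and the vanishing on crisp sets are two of the basic properties any generalized fuzzy entropy is usually required to preserve.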
Procedia PDF Downloads 450
42 Saccharification and Bioethanol Production from Banana Pseudostem
Authors: Elias L. Souza, Noeli Sellin, Cintia Marangoni, Ozair Souza
Abstract:
Among the different forms of reuse and recovery of agro-residual waste is the production of biofuels. The production of second-generation ethanol has been evaluated and proposed as one of the technically viable alternatives for this purpose. This research work employed the banana pseudostem as biomass. Two different chemical pre-treatment methods (acid hydrolysis with H2SO4 2% w/w and alkaline hydrolysis with NaOH 3% w/w) of dry and milled biomass (70 g/L of dry matter, ms) were assessed, and the corresponding reducing sugar yields, AR, (YAR), after enzymatic saccharification, were determined. The effect on YAR of increasing the dry matter (ms) from 70 to 100 g/L, in dry and milled biomass and also in fresh biomass, was analyzed. Changes in cellulose crystallinity and in biomass surface morphology due to the different chemical pre-treatments were analyzed by X-ray diffraction and scanning electron microscopy. The acid pre-treatment resulted in higher YAR values, whether related to the cellulose content under saccharification (RAR = 79.48) or to the biomass concentration employed (YAR/ms = 32.8%). In a comparison between alkaline and acid pre-treatments, the latter led to an increase in the cellulose content of the reaction mixture from 52.8 to 59.8%; also, to a reduction of the cellulose crystallinity index from 51.19 to 33.34% and increases in RAR (43.1%) and YAR/ms (39.5%). The increase of dry matter (ms) from 70 to 100 g/L in the acid pre-treatment resulted in a decrease of the average yields in RAR (43.1%) and YAR/ms (18.2%). Using the fresh pseudostem with the broth removed, whether at 70 g/L or 100 g/L of dry matter (ms), similarly to the alkaline pre-treatment, led to lower average values in RAR (67.2% and 42.2%) and in YAR/ms (28.4% and 17.8%), respectively. 
The acid pre-treated and saccharified biomass broth was detoxified with different activated carbon contents (1, 2 and 4% w/v), concentrated up to AR = 100 g/L and fermented by Saccharomyces cerevisiae. The yield (YP/AR) and productivity (QP) in ethanol were determined and compared to the values obtained from the fermentation of non-concentrated/non-detoxified broth (AR = 18 g/L) and concentrated/non-detoxified broth (AR = 100 g/L). The highest average value of YP/AR (0.46 g/g) was obtained from the fermentation of non-concentrated broth. This value did not present a significant difference (p<0.05) when compared to the YP/AR of the broth concentrated and detoxified with activated carbon at 1% w/v (YP/AR = 0.41 g/g). However, a higher ethanol productivity (QP = 1.44 g/L.h) was achieved through broth detoxification. This value was 75% higher than the average QP determined using concentrated and non-detoxified broth (QP = 0.82 g/L.h), and 22% higher than the QP found in the non-concentrated broth (QP = 1.18 g/L.h).
Keywords: biofuels, biomass, saccharification, bioethanol
Procedia PDF Downloads 343
41 Numerical Study of the Breakdown of Surface Divergence Based Models for Interfacial Gas Transfer Velocity at Large Contamination Levels
Authors: Yasemin Akar, Jan G. Wissink, Herlina Herlina
Abstract:
The effect of various levels of contamination on the interfacial air–water gas transfer velocity is studied by direct numerical simulation (DNS). The interfacial gas transfer is driven by isotropic turbulence, introduced at the bottom of the computational domain, diffusing upwards. The isotropic turbulence is generated in a separate, concurrently running large-eddy simulation (LES). The flow fields in the main DNS and the LES are solved using fourth-order discretisations of convection and diffusion. To solve the transport of dissolved gases in water, a fifth-order-accurate WENO scheme is used for scalar convection, combined with a fourth-order central discretisation for scalar diffusion. The damping effect of the surfactant contamination on the near-surface (horizontal) velocities in the DNS is modelled using horizontal gradients of the surfactant concentration. An important parameter in this model, which corresponds to the level of contamination, is ReMa/We, where Re is the Reynolds number, Ma is the Marangoni number, and We is the Weber number. It was previously found that even small levels of contamination (small ReMa/We) lead to a significant drop in the interfacial gas transfer velocity KL. It is known that KL depends on both the Schmidt number Sc (the ratio of the kinematic viscosity to the gas diffusivity in water) and the surface divergence β, i.e. KL ∝ √(β/Sc). Previously it has been shown that this relation works well for surfaces with low to moderate contamination. However, it breaks down for β close to zero. To study the validity of this dependence in the presence of surface contamination, simulations were carried out for ReMa/We = 0, 0.12, 0.6, 1.2, 6, 30 and Sc = 2, 4, 8, 16, 32. First, it will be shown that the scaling of KL with Sc remains valid also for larger ReMa/We. 
This is an important result, indicating that, for various levels of contamination, the numerical results obtained at low Schmidt numbers are also valid for significantly higher and more realistic Sc. Subsequently, it will be shown that, with increasing ReMa/We, the dependency of KL on β begins to break down as the increased damping of near-surface fluctuations results in an increased damping of β. Especially at large levels of contamination, this damping is so severe that KL is found to be significantly underestimated.
Keywords: contamination, gas transfer, surfactants, turbulence
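The practical content of the quoted scaling KL ∝ √(β/Sc) is that KL·√Sc should collapse onto a single value at a given β, which is why low-Schmidt-number results extrapolate to realistic Sc. A minimal numeric sketch (the proportionality constant c and the β value are arbitrary, for illustration only):

```python
import math

def k_l(beta, sc, c=1.0):
    """Surface-divergence model for the gas transfer velocity:
    K_L = c * sqrt(beta / Sc), with c an illustrative constant."""
    return c * math.sqrt(beta / sc)

# If the scaling holds, K_L * sqrt(Sc) is independent of Sc for fixed beta,
# so results at Sc = 2 carry over to Sc = 32 and beyond.
for sc in (2, 4, 8, 16, 32):
    print(sc, k_l(beta=0.5, sc=sc) * math.sqrt(sc))
```

The breakdown reported in the abstract corresponds to β being damped toward zero by contamination, at which point this simple proportionality no longer predicts KL.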
Procedia PDF Downloads 300
40 Procedure for Impact Testing of Fused Recycled Glass
Authors: David Halley, Tyra Oseng-Rees, Luca Pagano, Juan A Ferriz-Papi
Abstract:
Recycled glass material is made from 100% recycled bottle glass and consumes less energy than re-melt technology. It also uses no additives in the manufacturing process, allowing the recycled glass material, in principle, to go back into the recycling stream after end-of-use, contributing to the circular economy with a low ecological impact. The aim of this paper is to investigate a procedure for testing the recycled glass material for impact resistance, so it can be applied to pavements and other surfaces which are at risk of impact during service. A review of different impact test procedures for construction materials was undertaken, comparing methodologies and international standards applied to other materials such as natural stone, ceramics and glass. A drop-weight impact testing machine was designed and manufactured in-house to perform these tests. As a case study, samples of the recycled glass material were manufactured in two different thicknesses and tested. The impact energy was calculated theoretically, obtaining values of 5 and 10 J. The results on the material were subsequently discussed. The procedure can be improved using high-speed video technology to calculate the velocity just before and immediately after impact and thus determine the absorbed energy. The initial results obtained with this procedure were positive, although repeatability needs to be established to obtain a correlation of results and finally validate the procedure. The experiment with samples showed the practicality of this procedure and its applicability to impact testing of the recycled glass material, although further research needs to be developed.
Keywords: construction materials, drop weight impact, impact testing, recycled glass
Procedia PDF Downloads 296
39 A Mathematical Investigation of the Turkevich Organizer Theory in the Citrate Method for the Synthesis of Gold Nanoparticles
Authors: Emmanuel Agunloye, Asterios Gavriilidis, Luca Mazzei
Abstract:
Gold nanoparticles are commonly synthesized by reducing chloroauric acid with sodium citrate. This method, referred to as the citrate method, can produce spherical gold nanoparticles (NPs) in the size range 10-150 nm. Gold NPs of this size are useful in many applications. However, the NPs produced are usually polydisperse, and the synthesis is poorly reproducible. A better understanding of the synthesis mechanisms is thus required. This work thoroughly investigated the only model that describes this synthesis. The model combines mass and population balance equations, describing the NP synthesis through a sequence of chemical reactions. Chloroauric acid reacts with sodium citrate to form aurous chloride and dicarboxy acetone. The latter organizes aurous chloride in a nucleation step and concurrently degrades into acetone. The unconsumed precursor then grows the formed nuclei. However, depending on the pH, both the precursor and the reducing agent react differently, thus affecting the synthesis. In this work, we investigated the model under different conditions of pH, temperature and initial reactant concentrations. To solve the model, we used Parsival, a commercial numerical code, whilst to test it, we considered various conditions studied experimentally by different researchers, for which results are available in the literature. The model poorly predicted the experimental data. We believe that this is because the model does not account for the acid-base properties of both chloroauric acid and sodium citrate.
Keywords: citrate method, gold nanoparticles, Parsival, population balance equations, Turkevich organizer theory
Procedia PDF Downloads 203
38 MIMIC: A Multi Input Micro-Influencers Classifier
Authors: Simone Leonardi, Luca Ardito
Abstract:
Micro-influencers are effective elements in the marketing strategies of companies and institutions because of their capability to create a hyper-engaged audience around a specific topic of interest. In recent years, many scientific approaches and commercial tools have handled the task of detecting this type of social media user. These strategies adopt solutions ranging from rule-based machine learning models to deep neural networks and graph analysis on text, images, and account information. This work compares the existing solutions and proposes an ensemble method to generalize them to different input data and social media platforms. The deployed solution combines deep learning models on unstructured data with statistical machine learning models on structured data. We retrieve both social media account information and multimedia posts on Twitter and Instagram. These data are mapped into feature vectors for an eXtreme Gradient Boosting (XGBoost) classifier. Sixty different topics have been analyzed to build a rule-based gold standard dataset and to compare the performance of our approach against baseline classifiers. We prove the effectiveness of our work by comparing the accuracy, precision, recall, and F1 score of our model across different configurations and architectures. We obtained an accuracy of 0.91 with our best performing model.
Keywords: deep learning, gradient boosting, image processing, micro-influencers, NLP, social media
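The four evaluation metrics used to compare configurations all derive from the binary confusion matrix. A stdlib-only sketch (the labels below are illustrative, not the paper's data):

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, precision, recall, and F1 for binary labels
    (1 = micro-influencer, 0 = not), from the confusion matrix counts."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1

# Illustrative predictions only.
acc, prec, rec, f1 = binary_metrics([1, 0, 1, 1, 0, 0], [1, 0, 0, 1, 0, 1])
print(acc, prec, rec, f1)
```

Reporting all four together matters here because a gold standard built from rule-based labels can be imbalanced, in which case accuracy alone overstates performance.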
Procedia PDF Downloads 183
37 Non-Destructive Testing of Selective Laser Melting Products
Authors: Luca Collini, Michele Antolotti, Diego Schiavi
Abstract:
At present, complex geometries, shrinking production times, rapidly increasing demand, and high quality-standard requirements make non-destructive (ND) control of additively manufactured components an indispensable means. On the other hand, a technology gap and the lack of standards regulating the methods and the acceptance criteria make the NDT of these components a stimulating field still to be fully explored. To date, penetrant testing, acoustic wave, tomography, radiography, and semi-automated ultrasound methods have been tested on metal powder based products. External defects, distortion, surface porosity, roughness, texture, internal porosity, and inclusions are the typical defects in the focus of testing. Detection of density and layer compactness has also been attempted on stainless steels by the ultrasonic scattering method. In this work, the authors present and discuss radiographic and ultrasound ND testing on additively manufactured Ti₆Al₄V and Inconel parts obtained by the selective laser melting (SLM) technology. In order to test the possibilities offered by the radiographic method, both X-rays and γ-rays are tried on a set of specifically designed specimens realized by SLM. The specimens contain a family of defects representing the most commonly found, such as cracks and lack of fusion. The tests are also applied to real parts of various complexity and thickness. A set of practical indications and acceptance criteria is finally drawn.
Keywords: non-destructive testing, selective laser melting, radiography, UT method
Procedia PDF Downloads 147
36 Phytochemical Evaluation and In-Vitro Antibacterial Activity of Ethanolic Extracts of Moroccan Lavandula x Intermedia Leaves and Flowers
Authors: Jamila Fliou, Federica Spinola, Ouassima Riffi, Asmaa Zriouel, Ali Amechrouq, Luca Nalbone, Alessandro Giuffrida, Filippo Giarratana
Abstract:
This study performed a preliminary evaluation of the phytochemical composition and in vitro antibacterial activity of ethanolic extracts of Lavandula x intermedia leaves and flowers collected in the Fez-Meknes region of Morocco. Phytochemical analyses comprised qualitative colourimetric determinations of alkaloids, anthraquinones, and terpenes, and quantitative analysis of total polyphenols, flavonoids, and condensed tannins by UV spectrophotometer. Antibacterial activity was evaluated by determining minimum inhibitory concentration (MIC) and minimum bactericidal concentration (MBC) values against different ATCC bacterial strains. The phytochemical analysis showed a high amount of total polyphenols, flavonoids, and tannins in the leaf extract and, based on the colourimetric reaction, a higher amount of terpenes than in the flower extract. A positive colourimetric reaction for alkaloids and anthraquinones was detected for both extracts. The antibacterial activity of the leaf and flower extracts did not differ between Gram-positive and Gram-negative strains (p<0.05). The results of the present study suggest the possible use of ethanolic extracts of L. x intermedia collected in the Fez-Meknes region of Morocco as a natural agent against bacterial pathogens.
Keywords: antimicrobial activity, Lavandula spp., lavender, lavandin, UV spectrophotometric analysis
Procedia PDF Downloads 68
35 Crossing Narrative Waters in World Cinema: Alamar (2009) and Kaili Blues (2015)
Authors: Dustin Dill
Abstract:
The physical movement of crossing over water points to both developing narrative tropes and innovative cinematography in World Cinema today. Two prime examples, Alamar (2009) by Pedro González-Rubio and Kaili Blues (2015) by Bi Gan, demonstrate how contemporary storytelling in film not only rests upon these water shots but also emerges from them. The range of symbolism that these episodes in the story provoke goes hand in hand with the diverse filming sequences found in the respective productions. While González-Rubio decides to cut the scene into long and longer shots, Gan uses a single take. The differing angles depict equally unique directors and film projects: Alamar runs parallel to many definitions of the essay film, and Kaili Blues resonates much more with the mystery and art film. Nonetheless, the water-crossing scenes influence the narratives' subjects despite the generic consequences, and it is the essay, mystery, and art film genres that allow for a better understanding of World Cinema. As Tiago de Luca explains, World Cinema's prerogative of giving form to a certain type of spectator does not always line up. The immense number of interpretations of crossing water (the escape from suffering to find nirvana, rebirth, and colonization) underlines the difficulty of categorizing it. If before this kind of cross-genre trait defined World Cinema in its beginnings, this study observes that González-Rubio and Gan question the all-encompassing genre with their experimental shots of a universal narrative trope, the crossing of water.
Keywords: cinematography, genre, narrative, world cinema
Procedia PDF Downloads 293
34 Understanding Tourism Innovation through Fuzzy Measures
Authors: Marcella De Filippo, Delio Colangelo, Luca Farnia
Abstract:
In recent decades, the hyper-competition of the tourism scenario has driven the maturation of many businesses, attributing a central role to innovative processes and their dissemination in the economy of company management. At the same time, it has created the need to monitor the application of innovations, in order to govern and improve the performance of companies and destinations. The study aims to analyze and define innovation in the tourism sector. The research actions have concerned, on the one hand, in-depth interviews with experts, identifying innovation in terms of process and product, digitalization, and sustainability policies, and, on the other hand, the evaluation of the interaction between these factors, in terms of substitutability and complementarity in management scenarios, in order to identify which one is essential to be competitive in the global scenario. Fuzzy measures and the Choquet integral were used to elicit experts' preferences. This method allows one not only to evaluate the relative importance of each pillar but also, more interestingly, the level of interaction, ranging from complementarity to substitutability, between pairs of factors. The results of the survey are the following: in terms of Shapley values, experts assert that innovation is the most important factor (32.32), followed by digitalization (31.86), network (20.57) and sustainability (15.25). In terms of interaction indices, given the low degree of consensus among experts, the interaction between pairs of criteria could on average be ignored; however, it is worth noting that innovation and digitalization are the factors for which experts express the highest degree of interaction. Some experts consider these factors moderately complementary (with a peak of 57.14), while others consider them moderately substitutable (with a peak of -39.58). 
Another example, although an outlier, is the interaction between network and digitalization, in which one expert considers them markedly substitutable (-77.08).
Keywords: innovation, business model, tourism, fuzzy
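The Shapley values reported above can be computed from any fuzzy measure with the standard weighted-marginal-contribution formula. A small stdlib sketch; the two-criterion measure below is illustrative only, not the measure elicited from the experts:

```python
from itertools import combinations
from math import factorial

def shapley(mu, criteria):
    """Shapley value of each criterion for a fuzzy measure mu.

    mu maps frozensets of criteria to [0, 1], with mu(empty set) = 0 and
    mu(full set) = 1, so the Shapley values sum to 1.
    """
    n = len(criteria)
    values = {}
    for i in criteria:
        others = [c for c in criteria if c != i]
        phi = 0.0
        for r in range(n):
            for s in combinations(others, r):
                s = frozenset(s)
                weight = factorial(n - r - 1) * factorial(r) / factorial(n)
                phi += weight * (mu[s | {i}] - mu[s])  # marginal contribution
        values[i] = phi
    return values

# Illustrative sub-additive measure: mu(pair) < mu(i) + mu(j), i.e. the two
# criteria are partially substitutable (negative interaction index).
mu = {frozenset(): 0.0,
      frozenset({"innovation"}): 0.7,
      frozenset({"digitalization"}): 0.6,
      frozenset({"innovation", "digitalization"}): 1.0}
print(shapley(mu, ["innovation", "digitalization"]))
```

For this toy measure the interaction index mu(pair) - mu(i) - mu(j) = -0.3, the kind of negative value the abstract describes as substitutability.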
Procedia PDF Downloads 272
33 Microbial Dynamics and Sensory Traits of Spanish- and Greek-Style Table Olives (Olea europaea L. cv. Ascolana tenera) Fermented with Sea Fennel (Crithmum maritimum L.)
Authors: Antonietta Maoloni, Federica Cardinali, Vesna Milanović, Andrea Osimani, Ilario Ferrocino, Maria Rita Corvaglia, Luca Cocolin, Lucia Aquilanti
Abstract:
Table olives (Olea europaea L.) are among the most important fermented vegetables worldwide, while sea fennel (Crithmum maritimum L.) is an emerging food crop with interesting nutritional and sensory traits. Both are characterized by the presence of several bioactive compounds with potential beneficial health effects, thus representing two valuable substrates for the manufacture of innovative vegetable-based preserves. Given these premises, the present study aimed at exploring the co-fermentation of table olives and sea fennel to produce new high-value preserves. Spanish-style and Greek-style processing methods and the use of a multiple-strain starter were explored. The preserves were evaluated for their microbial dynamics and key sensory traits. During the fermentation, a progressive pH reduction was observed. Mesophilic lactobacilli, mesophilic lactococci, and yeasts were the main microbial groups at the end of the fermentation, whereas Enterobacteriaceae decreased during fermentation. An evolution of the microbiota was revealed by metataxonomic analysis, with Lactiplantibacillus plantarum dominating in the late stage of fermentation, irrespective of the processing method and use of the starter. Greek-style preserves were crunchier and less fibrous than Spanish-style ones and were preferred by the trained panelists.
Keywords: lactic acid bacteria, Lactiplantibacillus plantarum, metataxonomy, panel test, rock samphire
Procedia PDF Downloads 129
32 Critical Approach to Define the Architectural Structure of a Health Prototype in a Rural Area of Brazil
Authors: Domenico Chizzoniti, Monica Moscatelli, Letizia Cattani, Luca Preis
Abstract:
A primary healthcare facility in developing countries should be a multifunctional space able to respond to different requirements: flexibility, modularity, aggregation and reversibility. These basic features can be better satisfied if applied to an architectural artifact that complies with the typological, figurative and constructive aspects of the context in which it is located. Therefore, the purpose of this paper is to identify a procedure that can define, through a critical approach, the figurative aspects of the architectural structure of a health prototype for the marginal areas of developing countries. The application context is the rural areas of the northeast of Bahia, Brazil. The prototype is to be located in the rural district of Quingoma, in the municipality of Lauro de Freitas, a particular place where there is still a cultural fusion of black and indigenous populations. Based on a historical analysis of settlement strategies and architectural structures in spaces of public interest or collective use, this paper aims to provide a procedure able to identify the categories and rules underlying typological and figurative aspects, in order to detect significant and generalizable elements, as well as materials and constructive techniques typically adopted in the rural areas of Brazil. The object of this work is therefore not only the recovery of certain constructive approaches but also the development of a procedure that integrates the requirements of the primary healthcare prototype with its surrounding economic, social, cultural, settlement and figurative conditions.
Keywords: architectural typology, developing countries, local construction techniques, primary health care
Procedia PDF Downloads 324
31 Cold Spray High Entropy Alloy Coating Surface Microstructural Characterization and Mechanical Testing
Authors: Raffaella Sesana, Nazanin Sheibanian, Luca Corsaro, Sedat Özbilen, Rocco Lupoi, Francesco Artusio
Abstract:
High entropy alloy (HEA) coatings of Al0.1-0.5CoCrCuFeNi and MnCoCrCuFeNi on Mg substrates were prepared from mechanically alloyed HEA powder feedstocks at three different cold spray (CS) process gas (N2) temperatures (650, 750 and 850 °C). Mechanically alloyed and cold-sprayed HEA coatings were characterized by macro photography, OM, SEM+EDS study, micro-hardness testing, and roughness and porosity measurements. As a result of mechanical alloying (MA), the harder particles are deformed and fractured. The particles in the Cu-rich region were coarser and more globular than those in the A1 phase, which is relatively soft and ductile. In addition to the A1 particles, there were some separate Cu-rich regions. Due to the brittle nature of the powder and its acicular shape, the Mn-HEA powder exhibited a different trend, with smaller particle sizes. It is observed that MA results in a loose structure characterized by many gaps, cracks, signs of plastic deformation, and small particles attached to the surface of the particles. Considering the experimental results obtained, it is not possible to conclude that the chemical composition of the high entropy alloy influences the roughness of the coating. It has been observed that the deposited volume increases with temperature only in the case of the Al0.1 and Mn-based HEAs, while for the rest of the Al-based HEAs there are no noticeable changes. There is a direct correlation between micro-hardness and the chemical composition of a coating: the micro-hardness of a coating increases as the percentage of aluminum in the sample increases. Compared to the substrate, the coating has a much higher hardness, and the hardness measured at the interface is intermediate.
Keywords: characterisation, cold spraying, HEA coatings, SEM+EDS
Procedia PDF Downloads 64
30 The Concept of Accounting in Islamic Transactions
Authors: Ahmad Abdulkadir Ibrahim
Abstract:
The Islamic law of transactions laid down the methods and instruments of accounting and analyzed its basic assumptions in the modern world. There is a need to examine the implications of accounting initiatives in the Muslim world and attempt to outline the important characteristics of Islamic accounting, including how Islamic accounting resolves the problem of measuring the cost of Murabaha goods in case of exchange rate variation. The research discusses an analytical approach to the Islamic accounting concept, elaborating the jurisprudential matter and practical aspects of accounting in Islamic financial transactions. It also aims to make practitioners of accounting in the Islamic world aware of the concept of accounting in Islamic jurisprudence and its historical development. The methodology adopted in this research is the qualitative method, through consultation of relevant literature, focusing on a thematic study of the subject matter. This is followed by an analysis and discussion of the contents of the materials used. It is concluded that Islamic accounting is unique in its norms, being characterized by fairness, accuracy in measuring tools, truthfulness, mutual trust, moderation in making a profit, and tolerance. It is also qualified by capacity and flexibility in the tools and terminology used and invented by Islamic jurisprudence in the accounting system, which indicates its validity and consistency anytime and anywhere. An important conclusion of the research also lies in the refutation of the popular idea that the Italian writer Luca Pacioli was the first to develop the basis of double-entry bookkeeping, given the proofs presented by Muslim scholars of critical accounting developments, which cannot be ignored. 
It concludes further that Islamic jurisprudence draws an accounting system codified in the foundations of a market that is far from usury, fraud, cheating, and unfair competition in all areas.
Keywords: accounting, Islamic accounting, Islamic transactions, Islamic jurisprudence, double entry, murabaha, characteristics
Procedia PDF Downloads 64
29 Data Compression in Ultrasonic Network Communication via Sparse Signal Processing
Authors: Beata Zima, Octavio A. Márquez Reyes, Masoud Mohammadgholiha, Jochen Moll, Luca de Marchi
Abstract:
This work presents an approach that uses compressed sensing for signal encoding and information transfer within a guided wave sensor network composed of specially designed frequency steerable acoustic transducers (FSATs). Wave propagation in a damaged plate was simulated using the commercial FEM-based software COMSOL. Guided waves were excited by means of FSATs, characterized by the special shape of their electrodes, and modeled using PIC255 piezoelectric material. The special shape of the FSATs allows wave energy to be focused in a certain direction, according to the frequency components of the actuation signal, which makes a larger monitored area available. The process begins when a FSAT detects and records a reflection from damage in the structure; this signal is then encoded and prepared for transmission using a combined approach based on compressed sensing matching pursuit and quadrature amplitude modulation (QAM). Once the signal is encoded in binary characters, the information is transmitted between the nodes in the network. The message reaches the last node, where it is finally decoded and processed to be used for damage detection and localization purposes. The main aim of the investigation is to determine the location of detected damage using the reconstructed signals. The study demonstrates that the special steerable capabilities of FSATs not only facilitate the detection of damage but also permit transmitting the damage information to a chosen area in a specific direction of the investigated structure.
Keywords: data compression, ultrasonic communication, guided waves, FEM analysis
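The sparse-encoding step can be illustrated with plain matching pursuit: greedily pick the dictionary atom most correlated with the residual and subtract its contribution, keeping only a few coefficients to transmit. A stdlib sketch over a toy orthonormal dictionary (the dictionary and signal are illustrative, not the paper's guided-wave atoms):

```python
def matching_pursuit(signal, dictionary, n_iter=10, tol=1e-10):
    """Greedy matching pursuit: represent `signal` as a sparse combination
    of dictionary atoms (equal-length lists, assumed unit-norm)."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    residual = list(signal)
    coeffs = {}
    for _ in range(n_iter):
        # Select the atom most correlated with the current residual.
        k = max(range(len(dictionary)),
                key=lambda j: abs(dot(residual, dictionary[j])))
        c = dot(residual, dictionary[k])
        if abs(c) < tol:
            break  # residual carries no more energy along any atom
        coeffs[k] = coeffs.get(k, 0.0) + c
        residual = [r - c * d for r, d in zip(residual, dictionary[k])]
    return coeffs, residual

# Toy orthonormal dictionary: the standard basis of R^4.
atoms = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
coeffs, residual = matching_pursuit([0.0, 3.0, 0.0, -1.5], atoms)
print(coeffs)  # sparse code: two nonzero coefficients out of four
```

Only the (index, coefficient) pairs need to be mapped to QAM symbols and sent, which is where the compression over transmitting the raw waveform comes from.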
Procedia PDF Downloads 124
28 Clustering for Detection of the Population at Risk of Anticholinergic Medication
Authors: A. Shirazibeheshti, T. Radwan, A. Ettefaghian, G. Wilson, C. Luca, Farbod Khanizadeh
Abstract:
Anticholinergic medication has been associated with events such as falls, delirium, and cognitive impairment in older patients. To further assess this, anticholinergic burden scores have been developed to quantify risk. A risk model based on clustering was deployed in a healthcare management system to cluster patients into multiple risk groups, according to the anticholinergic burden scores of the multiple medicines prescribed to them, to facilitate clinical decision-making. To do so, anticholinergic burden scores of drugs were extracted from the literature, which categorizes the risk on a scale of 1 to 3. Given the patients' prescription data in the healthcare database, a weighted anticholinergic risk score was derived per patient based on the prescription of multiple anticholinergic drugs. This study was conducted on over 300,000 records of patients currently registered with a major regional UK-based healthcare provider. The weighted risk scores were used as inputs to an unsupervised learning algorithm (mean-shift clustering) that groups patients into clusters representing different levels of anticholinergic risk. To further evaluate the performance of the model, associations between the average risk score within each group and other factors, such as socioeconomic status (i.e., the Index of Multiple Deprivation) and an index of health and disability, were investigated. The clustering identifies a group of 15 patients at the highest risk from multiple anticholinergic medications. Our findings also show that this group of patients is located within more deprived areas of London compared to the population of the other risk groups. Furthermore, the prescription of anticholinergic medicines is more skewed towards female than male patients, indicating that females are more at risk from this kind of multiple medication. 
The risk may be monitored and controlled in healthcare management systems that are well equipped with artificial intelligence.Keywords: anticholinergic medicines, clustering, deprivation, socioeconomic status
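The scoring-and-clustering pipeline described above can be sketched in a few lines. This is a minimal illustration, not the study's implementation: the drug burden scores, patient prescription lists, and the simple flat-kernel 1-D mean-shift below are invented stand-ins (the study applied mean-shift to real prescription data for over 300,000 patients).

```python
# Hypothetical anticholinergic burden scores (scale 1-3, as in the abstract).
BURDEN = {"amitriptyline": 3, "paroxetine": 3, "ranitidine": 1, "codeine": 1}

def patient_risk(prescriptions):
    """Weighted risk score: sum of burden scores over prescribed drugs."""
    return sum(BURDEN.get(drug, 0) for drug in prescriptions)

def mean_shift_1d(points, bandwidth=1.5, iters=50):
    """Minimal flat-kernel mean-shift in 1-D: each mode repeatedly moves
    to the mean of the data points within `bandwidth` of it."""
    modes = list(points)
    for _ in range(iters):
        modes = [
            sum(p for p in points if abs(p - m) <= bandwidth)
            / sum(1 for p in points if abs(p - m) <= bandwidth)
            for m in modes
        ]
    # Collapse near-identical converged modes into cluster labels.
    centers, labels = [], []
    for m in modes:
        for i, c in enumerate(centers):
            if abs(m - c) < 1e-3:
                labels.append(i)
                break
        else:
            centers.append(m)
            labels.append(len(centers) - 1)
    return labels, centers

patients = [
    ["codeine"],
    ["ranitidine", "codeine"],
    ["amitriptyline", "paroxetine"],
    ["amitriptyline", "paroxetine", "codeine"],
]
scores = [patient_risk(p) for p in patients]
labels, centers = mean_shift_1d(scores)
```

With these toy inputs the four patients separate into a low-risk and a high-risk cluster; in practice one would tune the bandwidth and use a library implementation such as scikit-learn's MeanShift.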
Procedia PDF Downloads 21227 Applied Spatial Mapping and Monitoring of Illegal Landfills for Deprived Urban Areas in Romania
Authors: Șercăianu Mihai, Aldea Mihaela, Iacoboaea Cristina, Luca Oana, Nenciu Ioana
Abstract:
The rise and mitigation of unauthorized waste dumps are a significant global issue within waste management ecosystems, with a particularly pronounced impact on disadvantaged communities. Globally, including in Romania, many individuals live in houses without legal recognition, lacking ownership documents or construction permits, in areas known as "informal settlements." An increasing number of regions and cities in Romania are struggling to manage their illegal waste dumps, especially in the context of increasing poverty and a lack of regulation related to informal settlements. One such informal settlement is located at the end of Bistra Street in Câlnic, within the Reșița Municipality of Caraș-Severin County. The article presents a case study that focuses on employing remote sensing techniques and spatial data to monitor and map illegal waste practices, with subsequent integration into a geographic information system tailored for the Reșița community. In addition, the paper outlines the steps involved in devising strategies aimed at enhancing waste management practices in disadvantaged areas, aligning with the shift toward a circular economy. Results presented in the paper contain a spatial mapping and visualization methodology, calibrated with in situ data collection, applicable for identifying illegal landfills. The emergence and neutralization of illegal dumps pose a challenge in the field of waste management. These approaches, which prove effective where conventional solutions have failed, need to be replicated and adopted more widely.Keywords: informal settlements, GIS, waste dumps, waste management, monitoring
Procedia PDF Downloads 8826 Faster Pedestrian Recognition Using Deformable Part Models
Authors: Alessandro Preziosi, Antonio Prioletti, Luca Castangia
Abstract:
Deformable part models achieve high precision in pedestrian recognition, but all publicly available implementations are too slow for real-time applications. We implemented a deformable part model (DPM) algorithm fast enough for real-time use by exploiting information about the camera position and orientation. This implementation is both faster and more precise than alternative DPM implementations. These results are obtained by computing convolutions in the frequency domain and using lookup tables to speed up feature computation. This approach is almost an order of magnitude faster than the reference DPM implementation, with no loss in precision. Knowing the position of the camera with respect to the horizon, it is also possible to prune many hypotheses based on their size and location. The range of acceptable sizes and positions is set by looking at the statistical distribution of bounding boxes in labelled images. With this approach it is not necessary to compute the entire feature pyramid: for example, higher-resolution features are only needed near the horizon. This results in an increase in mean average precision of 5% and an increase in speed by a factor of two. Furthermore, to reduce misdetections involving small pedestrians near the horizon, input images are supersampled near the horizon. Supersampling the image at 1.5 times the original scale results in an increase in precision of about 4%. The implementation was tested against the public KITTI dataset, obtaining an 8% improvement in mean average precision over the best performing DPM-based method. By allowing for a small loss in precision, computational time can easily be brought down to our target of 100 ms per image, reaching a solution that is faster and still more precise than all publicly available DPM implementations.Keywords: autonomous vehicles, deformable part model, dpm, pedestrian detection, real time
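The frequency-domain convolution trick mentioned above rests on the convolution theorem: multiplying zero-padded spectra is equivalent to convolving the signals, which is far cheaper for large kernels. A minimal 1-D NumPy sketch for illustration only (the paper applies the same idea to 2-D feature maps):

```python
import numpy as np

def conv_fft(signal, kernel):
    """Linear convolution via the frequency domain: FFT both inputs,
    zero-padded to the full output length, multiply, inverse FFT."""
    n = len(signal) + len(kernel) - 1
    return np.fft.irfft(np.fft.rfft(signal, n) * np.fft.rfft(kernel, n), n)

sig = np.array([1.0, 2.0, 3.0, 4.0])
ker = np.array([1.0, -1.0])
fast = conv_fft(sig, ker)
slow = np.convolve(sig, ker)  # direct-sum reference for comparison
```

Both paths return the same result; the FFT route wins once the kernel and signal are large, which is the regime DPM filter responses operate in.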
Procedia PDF Downloads 28125 Advancing Urban Sustainability through Data-Driven Machine Learning Solutions
Authors: Nasim Eslamirad, Mahdi Rasoulinezhad, Francesco De Luca, Sadok Ben Yahia, Kimmo Sakari Lylykangas, Francesco Pilla
Abstract:
With ongoing urbanization, cities face increasing environmental challenges impacting human well-being. To tackle these issues, data-driven approaches in urban analysis have gained prominence, leveraging urban data to promote sustainability. Integrating Machine Learning techniques enables researchers to analyze and predict complex environmental phenomena, such as Urban Heat Island (UHI) occurrences in urban areas. This paper demonstrates the implementation of a data-driven approach with interpretable Machine Learning algorithms to conduct comprehensive data analyses for sustainable urban design. The developed framework and algorithms are demonstrated for Tallinn, Estonia, to develop sustainable urban strategies to mitigate urban heat waves. Geospatial data, preprocessed and labeled with UHI levels, are used to train various ML models, with Logistic Regression emerging as the best-performing model based on evaluation metrics. From it, a mathematical equation is derived that classifies areas as affected or unaffected by UHI, providing insights into UHI occurrences based on buildings and urban features. The derived formula highlights the importance of building volume, height, area, and shape length in creating an urban environment with UHI impact. The data-driven approach and derived equation inform mitigation strategies and sustainable urban development in Tallinn and offer valuable guidance for other locations with varying climates.Keywords: data-driven approach, machine learning transparent models, interpretable machine learning models, urban heat island effect
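As a toy illustration of the modelling step, here is a logistic regression fitted by plain batch gradient descent on invented "building feature" data. Everything below (features, labels, learning rate, the v + h > 1 decision rule) is a hypothetical stand-in; the study used real geospatial features for Tallinn and its own tooling.

```python
import math, random

random.seed(0)
# Synthetic samples: (building_volume, building_height), both scaled to [0, 1].
# Assumed rule for the toy labels: larger, taller buildings -> UHI present.
X = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(200)]
y = [1 if v + h > 1.0 else 0 for v, h in X]

w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(2000):                       # batch gradient descent
    gw, gb = [0.0, 0.0], 0.0
    for (v, h), t in zip(X, y):
        p = 1.0 / (1.0 + math.exp(-(w[0] * v + w[1] * h + b)))
        gw[0] += (p - t) * v
        gw[1] += (p - t) * h
        gb += p - t
    w[0] -= lr * gw[0] / len(X)
    w[1] -= lr * gw[1] / len(X)
    b -= lr * gb / len(X)

def predict(v, h):
    """The fitted 'mathematical equation': sigmoid(w.x + b) > 0.5."""
    return 1.0 / (1.0 + math.exp(-(w[0] * v + w[1] * h + b))) > 0.5

accuracy = sum(predict(v, h) == bool(t) for (v, h), t in zip(X, y)) / len(X)
```

The fitted coefficients play the role of the "derived formula" the abstract describes: their signs and magnitudes indicate how each building feature pushes an area toward or away from UHI classification.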
Procedia PDF Downloads 3924 Rumination Time and Reticuloruminal Temperature around Calving in Eutocic and Dystocic Dairy Cows
Authors: Levente Kovács, Fruzsina Luca Kézér, Ottó Szenci
Abstract:
Predicting the onset of calving and recognizing difficulties at calving are of great importance in decreasing neonatal losses and reducing the risk of health problems in the early postpartum period. In this study, changes in rumination time, reticuloruminal pH, and temperature were investigated in eutocic (EUT, n = 10) and dystocic (DYS, n = 8) dairy cows around parturition. Rumination time was continuously recorded using an acoustic biotelemetry system, whereas reticuloruminal pH and temperature were recorded using an indwelling and wireless data transmitting system. The recording period lasted from 3 d before calving until 7 days in milk. For the comparison of rumination time and reticuloruminal characteristics between groups, time to return to baseline (the time interval required to return to baseline from the delivery of the calf) and area under the curve (AUC, for both the prepartum and postpartum periods) were calculated for each parameter. Rumination time decreased from baseline 28 h before calving both for EUT and DYS cows (P = 0.023 and P = 0.017, respectively). From 20 h before calving onwards, it decreased further, reaching 32.4 ± 2.3 and 13.2 ± 2.0 min/4 h between 8 and 4 h before delivery in EUT and DYS cows, respectively, and then fell below 10 and 5 min during the last 4 h before calving (P = 0.003 and P = 0.008, respectively). By 12 h after delivery, rumination time had reached 42.6 ± 2.7 and 51.0 ± 3.1 min/4 h in DYS and EUT dams, respectively; however, AUC and time to return to baseline suggested lower rumination activity in DYS cows than in EUT dams over the 168-h postpartum observational period (P = 0.012 and P = 0.002, respectively). Reticuloruminal pH decreased from baseline 56 h before calving both for EUT and DYS cows (P = 0.012 and P = 0.016, respectively), but did not differ between groups before delivery.
In DYS cows, reticuloruminal temperature decreased from baseline 32 h before calving by 0.23 ± 0.02 °C (P = 0.012), whereas in EUT cows such a decrease was found only 20 h before delivery (0.48 ± 0.05 °C, P < 0.01). The AUC of reticuloruminal temperature calculated for the prepartum period was greater in EUT cows than in DYS cows (P = 0.042). During the first 4 h after calving, it decreased from 39.7 ± 0.1 to 39.0 ± 0.1 °C and from 39.8 ± 0.1 to 38.8 ± 0.1 °C in EUT and DYS cows, respectively (P < 0.01 for both groups), returning to baseline levels 35.4 ± 3.4 and 37.8 ± 4.2 h after calving in EUT and DYS cows, respectively. Based on our results, continuous monitoring of changes in rumination time and reticuloruminal temperature seems promising for the early detection of cows at higher risk of dystocia. The depressed postpartum rumination time of DYS cows highlights the importance of monitoring cows experiencing difficulties at calving.Keywords: reticuloruminal pH, reticuloruminal temperature, rumination time, dairy cows, dystocia
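The two summary statistics used for the group comparison above, AUC and time to return to baseline, are straightforward to compute from a sampled time series. A sketch on invented rumination numbers (not the study's recordings):

```python
# Illustrative rumination-time series, min per 4-h block, hours relative
# to delivery; values are invented for the sketch.

def trapezoid_auc(times_h, values):
    """Area under the curve by the trapezoidal rule."""
    return sum((values[i] + values[i + 1]) / 2 * (times_h[i + 1] - times_h[i])
               for i in range(len(times_h) - 1))

def time_to_return(times_h, values, baseline, tol=0.05):
    """First post-calving time at which the signal is back within a
    fractional tolerance of its pre-calving baseline, else None."""
    for t, v in zip(times_h, values):
        if t > 0 and abs(v - baseline) <= tol * baseline:
            return t
    return None

hours = [-8, -4, 0, 4, 8, 12]
rumination = [32, 10, 5, 20, 35, 51]
auc = trapezoid_auc(hours, rumination)
recovery = time_to_return(hours, rumination, baseline=50, tol=0.1)
```

A lower postpartum AUC or a longer time to return, as reported for the DYS cows, both indicate depressed rumination after a difficult calving.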
Procedia PDF Downloads 31623 A Settlement Strategy for Health Facilities in Emerging Countries: A Case Study in Brazil
Authors: Domenico Chizzoniti, Monica Moscatelli, Letizia Cattani, Piero Favino, Luca Preis
Abstract:
A settlement strategy should anticipate and respond to the needs of existing and future communities through the provision of primary health care facilities in marginalized areas. Access to a health care network is important for improving healthcare coverage, which is often lacking in developing countries. The study shows that a sound sanitary system strategy in rural contexts brings advantages to an existing settlement: improved transport, communication, water, and social facilities. The objective of this paper is to define a possible methodology to implement primary health care facilities in disadvantaged areas of emerging countries. In this research, we analyze the case study of Lauro de Freitas, a municipality in the Brazilian state of Bahia, part of the Metropolitan Region of Salvador, with an area of 57.662 km² and 194,641 inhabitants. The health localization system in Lauro de Freitas is an integrated process that involves not only geographical aspects but also a set of factors: population density, epidemiological data, allocation of services, road networks, and more. Data were also collected using semi-structured interviews and questionnaires administered to the local population. The synthesized data suggest that, moving away from the coast where the concentration of population and services is greatest, a network of primary health care facilities can improve the living conditions of small, dispersed communities. Based on the health service needs of populations, we have developed a methodological approach that is particularly useful in rural and remote contexts in emerging countries.Keywords: healthcare, settlement strategy, urban health, rural
Procedia PDF Downloads 36822 Advancing Spatial Mapping and Monitoring of Illegal Landfills for Deprived Urban Areas in Romania
Authors: Șercăianu Mihai, Aldea Mihaela, Iacoboaea Cristina, Luca Oana, Nenciu Ioana
Abstract:
The emergence and neutralization of illegal waste dumps represent a global concern for waste management ecosystems, with a particularly pronounced impact on disadvantaged communities. All over the world, and in Romania in particular, a significant number of people reside in houses lacking any legal documentation, such as land ownership titles or building permits. These areas are referred to as “informal settlements”. An increasing number of regions and cities in Romania are struggling to manage their waste dumps, especially in the context of increasing poverty and a lack of regulation related to informal settlements. An example of such an informal settlement can be found at the terminus of Bistra Street in Câlnic, which falls under the jurisdiction of the Municipality of Reșița in Caraș-Severin County. The article presents a case study that focuses on employing remote sensing techniques and spatial data to monitor and map illegal waste practices, with subsequent integration into a geographic information system tailored for the Reșița community. In addition, the paper outlines the steps involved in devising strategies aimed at enhancing waste management practices in disadvantaged areas, aligning with the shift toward a circular economy. Results presented in the paper contain a spatial mapping and visualization methodology, calibrated with in situ data collection, applicable for identifying illegal landfills. The emergence and neutralization of illegal dumps pose a challenge in the field of waste management. These approaches, which prove effective where conventional solutions have failed, need to be replicated and adopted more widely.Keywords: waste dumps, waste management, monitoring, GIS, informal settlements
Procedia PDF Downloads 8721 A Data-Driven Agent Based Model for the Italian Economy
Authors: Michele Catalano, Jacopo Di Domenico, Luca Riccetti, Andrea Teglio
Abstract:
We develop a data-driven agent-based model (ABM) for the Italian economy. We calibrate the model's initial conditions and parameters. As a preliminary step, we replicate the Monte Carlo simulation for the Austrian economy. Then, we evaluate the dynamic properties of the model: the long-run equilibrium and the allocative efficiency in terms of disequilibrium patterns arising in the search and matching process for final goods, capital, intermediate goods, and credit markets. From this perspective, we use a randomized initial condition approach. We perform a robustness analysis, perturbing the system under different parameter setups. We explore the empirical properties of the model using a rolling window forecast exercise from 2010 to 2022 to observe the model’s forecasting ability in the wake of the COVID-19 pandemic. We analyze the properties of the model with different numbers of agents, that is, at different scales of the model compared to the real economy. The model generally displays transient dynamics that properly fit macroeconomic data in terms of forecasting ability. We stress the model with a large set of shocks, namely interest rate policy, fiscal policy, and exogenous factors such as external foreign demand for exports. In this way, we can explore the most exposed sectors of the economy. Finally, we modify the technology mix of the various sectors and, consequently, the underlying input-output sectoral interdependence to stress the economy and observe the long-run projections. In this way, the model can generate endogenous crises due to the implied structural change, technological unemployment, and a potential lack of aggregate demand, creating the conditions for cyclical endogenous crises to be reproduced in this artificial economy.Keywords: agent-based models, behavioral macro, macroeconomic forecasting, micro data
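The rolling-window forecast exercise mentioned above has a simple mechanical skeleton: slide a training window over the series, forecast ahead, and score the error. A toy sketch with an invented series and a naive persistence forecaster standing in for the ABM (the dip mimics a COVID-like shock):

```python
# Invented quarterly GDP-like series; the sharp dip imitates a shock.
series = [100, 102, 101, 105, 104, 60, 75, 90, 98, 103, 106, 108]

def rolling_forecast_mae(data, window=4, horizon=1):
    """Slide a training window over the series, forecast `horizon` steps
    ahead (here trivially: repeat the last observed value), and return
    the mean absolute error over all windows."""
    errors = []
    for start in range(len(data) - window - horizon + 1):
        train = data[start:start + window]
        forecast = train[-1]                    # persistence model
        actual = data[start + window + horizon - 1]
        errors.append(abs(forecast - actual))
    return sum(errors) / len(errors)

mae = rolling_forecast_mae(series)
```

In the study the forecaster is the calibrated ABM rather than a persistence rule, but the windowing and scoring logic is the same, which is what makes the 2010-2022 exercise comparable across model scales.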
Procedia PDF Downloads 6920 A Flexible Real-Time Eco-Drive Strategy for Electric Minibus
Authors: Felice De Luca, Vincenzo Galdi, Piera Stella, Vito Calderaro, Adriano Campagna, Antonio Piccolo
Abstract:
Sustainable mobility has become one of the major issues of recent years. The challenge of reducing polluting emissions as much as possible has led to the production and diffusion of less polluting internal combustion engine vehicles and to the adoption of green energy vectors, such as vehicles powered by natural gas or LPG and, more recently, hybrid and electric ones. While the spread of electric vehicles for private use is becoming a reality, albeit rather slowly, the same is not happening for vehicles used for public transport, especially those that operate in the congested areas of cities. Even if the first electric buses are increasingly being offered on the market, the problem of autonomy remains central for battery-fed vehicles with long daily routes and little time available for recharging. In fact, at present, batteries are still too large, too heavy, and unable to guarantee the required autonomy. Therefore, in order to maximize energy management on the vehicle, the optimization of driving profiles offers a faster and cheaper contribution to improving vehicle autonomy. In this paper, following the authors’ previous works on electric vehicles in public transport and energy management strategies in the electric mobility area, an eco-driving strategy for an electric bus is presented and validated. In particular, the characteristics of the prototype bus are described, and a general-purpose eco-drive methodology is briefly presented. The model is first simulated in MATLAB™ and then implemented on a mobile device installed on board a prototype bus developed by the authors in a previous research project. The implemented solution provides the bus driver with suggestions on the driving style to adopt. The results of a test in a real case are shown to highlight the effectiveness of the proposed solution in terms of energy saving.Keywords: eco-drive, electric bus, energy management, prototype
Procedia PDF Downloads 14219 Vortex Flows under Effects of Buoyant-Thermocapillary Convection
Authors: Malika Imoula, Rachid Saci, Renee Gatignol
Abstract:
A numerical investigation is carried out to analyze vortex flows in a free surface cylinder driven by independently rotating and differentially heated boundaries. As a basic uncontrolled isothermal flow, we consider configurations which exhibit steady axisymmetric toroidal-type vortices at the free surface, under given rates of uniform bottom disk rotation and for selected aspect ratios of the enclosure. In the isothermal case, we show that sidewall differential rotation constitutes an effective kinematic means of flow control: the reverse flow regions may be suppressed under very weak co-rotation rates, while an enhancement of the vortex patterns is observed under weak counter-rotation. However, in this latter case, high rates of counter-rotation considerably reduce the strength of the meridian flow and cause its confinement to a narrow layer on the bottom disk, while the remaining bulk flow is diffusion dominated and controlled by the sidewall rotation. The main control parameters in this case are the rotational Reynolds number, the cavity aspect ratio, and the rotation rate ratio. The study then proceeded to consider the sensitivity of the vortex pattern, within the Boussinesq approximation, to a small temperature gradient set between the ambient fluid and an axial thin rod mounted on the cavity axis. Two additional parameters are introduced, namely the Richardson number Ri and the Marangoni number Ma (or the thermocapillary Reynolds number). Results revealed that reducing the rod length induces the formation of on-axis bubbles instead of toroidal structures. Besides, the stagnation characteristics are significantly altered under the combined effects of buoyant-thermocapillary convection. Buoyancy, induced under sufficiently high Ri, was shown to predominate over the thermocapillary motion, causing the enhancement (suppression) of breakdown when the rod is warmer (cooler) than the ambient fluid.
However, over small ranges of Ri, the sensitivity of the flow to surface tension gradients was clearly evidenced, and results showed its full control over the occurrence and location of breakdown. In particular, the detailed timewise evolution of the flow indicated that weak thermocapillary motion was sufficient to prevent the formation of toroidal patterns. The latter detach from the surface and undergo considerable size reduction while moving towards the bulk flow before vanishing. Further calculations revealed that the pattern reappears with increasing time as a steady bubble type on the rod. However, in the absence of the central rod, and also in the case of small rod length l, the flow evolved into a steady state without any breakdown.Keywords: buoyancy, cylinder, surface tension, toroidal vortex
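For orientation, the governing dimensionless groups named above can be evaluated with one common set of definitions. All property values below are invented, silicone-oil-like numbers for illustration, not the paper's parameter choices, and the definitions used (Ri = Gr/Re², Ma based on the surface tension temperature coefficient) are standard conventions assumed here:

```python
# Illustrative fluid/geometry parameters (assumed, not from the paper).
g = 9.81            # gravitational acceleration, m/s^2
omega = 2.0         # bottom disk rotation rate, rad/s
R = 0.05            # cylinder radius, m
nu = 1e-5           # kinematic viscosity, m^2/s
beta = 1e-3         # thermal expansion coefficient, 1/K
dT = 1.0            # rod-to-ambient temperature difference, K
dsigma_dT = -8e-5   # surface tension temperature coefficient, N/(m K)
mu = 9e-3           # dynamic viscosity, Pa s
alpha = 1e-7        # thermal diffusivity, m^2/s

Re = omega * R**2 / nu                    # rotational Reynolds number
Gr = g * beta * dT * R**3 / nu**2         # Grashof number
Ri = Gr / Re**2                           # Richardson number (buoyancy vs. inertia)
Ma = -dsigma_dT * dT * R / (mu * alpha)   # Marangoni number (thermocapillarity)
```

With these toy values Ri is small while Ma is large, the kind of regime in which the abstract reports surface tension gradients taking full control over breakdown.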
Procedia PDF Downloads 35918 Development and Validation of a Carbon Dioxide TDLAS Sensor for Studies on Fermented Dairy Products
Authors: Lorenzo Cocola, Massimo Fedel, Dragiša Savić, Bojana Danilović, Luca Poletto
Abstract:
An instrument for the detection and evaluation of gaseous carbon dioxide in the headspace of closed containers has been developed in the context of the Packsensor Italian-Serbian joint project. The device is based on Tunable Diode Laser Absorption Spectroscopy (TDLAS) with a Wavelength Modulation Spectroscopy (WMS) technique, in order to accomplish a non-invasive measurement inside closed containers of fermented dairy products (yogurts and fermented cheese in cups and bottles). The purpose of this instrument is the continuous monitoring of carbon dioxide concentration during incubation and storage over the whole shelf life of the product, in the presence of different microorganisms. The instrument’s optical front end has been designed to be integrated into a thermally stabilized incubator. An embedded computer provides processing of spectral artifacts and storage of an arbitrary set of calibration data, allowing a properly calibrated measurement on many samples (cups and bottles) of the different shapes and sizes commonly found in retail distribution. A calibration protocol has been developed in order to calibrate the instrument in the field, including on containers that are notoriously difficult to seal properly. This calibration protocol is described and evaluated against reference measurements obtained through an industry standard (sampling) carbon dioxide metering technique. Some sets of validation test measurements on different containers are reported. Two test recordings of carbon dioxide concentration evolution are shown as an example of instrument operation. The first demonstrates the ability to monitor rapid yeast growth in a contaminated sample through the increase of headspace carbon dioxide. Another experiment shows the dissolution transient with a non-saturated liquid medium in the presence of a carbon-dioxide-rich headspace atmosphere.Keywords: TDLAS, carbon dioxide, cups, headspace, measurement
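The calibration idea reduces to fitting a response curve against reference samples and inverting it. In the weak-absorption limit of the Beer-Lambert law the WMS signal is roughly proportional to concentration times path length, so a least-squares line is a reasonable sketch; all numbers below are invented for illustration:

```python
# Reference CO2 concentrations (from a sampling reference meter) and the
# corresponding instrument responses, in arbitrary units (invented data).
reference_conc = [0.0, 1.0, 2.0, 4.0, 8.0]          # % CO2
measured_signal = [0.02, 0.51, 1.01, 2.03, 4.05]    # WMS response (a.u.)

# Ordinary least-squares line: signal = slope * conc + intercept.
n = len(reference_conc)
mean_x = sum(reference_conc) / n
mean_y = sum(measured_signal) / n
slope = (sum((x - mean_x) * (y - mean_y)
             for x, y in zip(reference_conc, measured_signal))
         / sum((x - mean_x) ** 2 for x in reference_conc))
intercept = mean_y - slope * mean_x

def signal_to_conc(signal):
    """Invert the calibration line to report a concentration."""
    return (signal - intercept) / slope

co2 = signal_to_conc(2.03)
```

In practice each container geometry gets its own calibration set, since the optical path length through the headspace changes with the cup or bottle shape.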
Procedia PDF Downloads 32417 Comparison of Inexpensive Cell Disruption Techniques for an Oleaginous Yeast
Authors: Scott Nielsen, Luca Longanesi, Chris Chuck
Abstract:
Palm oil is obtained from the flesh and kernel of the fruit of oil palms and is the most productive and inexpensive oil crop. Global demand for palm oil is approximately 75 million metric tonnes, reflecting a 29% increase in global production since 2016. This expansion of oil palm cultivation has resulted in mass deforestation, vast biodiversity destruction, and increasing net greenhouse gas emissions. One possible alternative is to produce a saturated oil, similar to palm oil, from microbes such as oleaginous yeast. The yeasts can be cultured on sugars derived from second-generation sources and do not compete with tropical forests for land. One highly promising oleaginous yeast for this application is Metschnikowia pulcherrima. However, recent techno-economic modeling has shown that cell lysis and standard lipid extraction are major contributors to the cost of the oil. Typical cell disruption techniques to extract either single-cell oils or proteins have been based around bead-beating, homogenization, and acid lysis. However, these can have a detrimental effect on lipid quality and are energy-intensive. In this study, a vortex separator, which produces high shear with minimal energy input, was investigated as a potential low-energy method of lysing cells. This was compared to four more traditional methods (thermal lysis, acid lysis, alkaline lysis, and osmotic lysis). For each method, yeast loadings of 1 g/L, 10 g/L, and 100 g/L were examined. The quality of the cell disruption was measured by optical cell density, cell counting, and comparison of the particle size distribution profile over a 2-hour period. This study demonstrates that the vortex separator is highly effective at lysing the cells and could potentially be used as a simple apparatus for lipid recovery in an oleaginous yeast process.
The further development of this technology could potentially reduce the overall cost of microbial lipids in the future.Keywords: palm oil substitute, metschnikowia pulcherrima, cell disruption, cell lysis
Procedia PDF Downloads 20616 Monte Carlo and Biophysics Analysis in a Criminal Trial
Authors: Luca Indovina, Carmela Coppola, Carlo Altucci, Riccardo Barberi, Rocco Romano
Abstract:
In this paper, a real court case held in Italy at the Court of Nola is considered, in which a correct physical description, conducted with both a Monte Carlo and a biophysical analysis, would have been sufficient to arrive at conclusions confirmed by documentary evidence. This is an example of how forensic physics can be useful in confirming documentary evidence in order to reach hardly questionable conclusions. This was a libel trial in which the defendant, Mr. DS (Defendant for Slander), had falsely accused one of his neighbors, Mr. OP (Offended Person), of having caused him some damages. The damages would have been caused by an external plaster piece that would have detached from the neighbor’s property and hit Mr. DS while he was in his garden, more than a meter away from the facade of the building from which the plaster piece would have detached. In the trial, Mr. DS claimed to have suffered a scratch on his forehead, but he never produced the plaster that had hit him, nor was he able to say where the plaster piece had come from. Furthermore, Mr. DS presented a medical certificate with a diagnosis of contusion of the cerebral cortex. On the contrary, the images of Mr. OP’s security cameras do not show any movement in the garden of Mr. DS in a long interval of time (about 2 hours) around the time of the alleged accident, nor do they show anyone entering or leaving the house of Mr. DS in the same interval of time. Biophysical analysis shows that both the diagnosis in the medical certificate and the wound declared by the defendant, already in conflict with each other, are incompatible with the fall of external plaster pieces too small to be found. The wind was at level 1 on the Beaufort scale, that is, unable to raise even dust (which requires level 4 on the Beaufort scale).
Therefore, the motion of the plaster pieces can be described as projectile motion, whereas collisions with the building cornice can be treated using Newton’s law of restitution. Numerous Monte Carlo simulations show that the pieces of plaster could not even have reached the garden of Mr. DS, let alone a distance of over 1.30 meters. These results agree with the documentary evidence (images of Mr. OP’s security cameras) that Mr. DS could not have been hit by plaster pieces coming from Mr. OP’s property.Keywords: biophysics analysis, Monte Carlo simulations, Newton’s law of restitution, projectile motion
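The simulation logic combines three ingredients named above: projectile motion in calm air, a restitution-damped bounce on the cornice, and Monte Carlo sampling over uncertain parameters. The sketch below is illustrative only: the facade height, cornice geometry, push-off speeds, and restitution range are invented assumptions, not the case parameters.

```python
import math, random

G = 9.81  # gravitational acceleration, m/s^2

def landing_distance(h0, cornice_h, cornice_x, rng):
    """Horizontal landing distance (m) of a fragment detaching at height h0
    with a small random horizontal speed, possibly bouncing on a cornice
    ledge that extends cornice_x metres from the wall."""
    x, y = 0.0, h0
    vx = rng.uniform(0.0, 0.3)     # near-zero push-off speed (calm air)
    vy = 0.0
    dt = 1e-3                      # explicit Euler time step
    while y > 0.0:
        vy -= G * dt
        x += vx * dt
        y += vy * dt
        # Bounce on the cornice: Newton's law of restitution, v'_y = -e v_y.
        if y <= cornice_h and x <= cornice_x and vy < 0:
            y = cornice_h
            vy = -rng.uniform(0.2, 0.5) * vy
            if abs(vy) < 0.1:      # fragment comes to rest on the ledge
                return x
    return x

rng = random.Random(42)
distances = [landing_distance(h0=6.0, cornice_h=3.0, cornice_x=0.5, rng=rng)
             for _ in range(1000)]
max_reach = max(distances)
```

Even with the horizontal speed sampled generously, the maximum landing distance over a thousand trials stays well short of the 1.30 m distance at issue, which is the qualitative shape of the paper's conclusion.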
Procedia PDF Downloads 13115 Aerodynamic Interaction between Two Speed Skaters Measured in a Closed Wind Tunnel
Authors: Ola Elfmark, Lars M. Bardal, Luca Oggiano, Håvard Myklebust
Abstract:
Team pursuit is a relatively new event in international long track speed skating. For a single speed skater, aerodynamic drag accounts for up to 80% of the braking force, so reducing drag can greatly improve performance. In a team pursuit, the interactions between athletes in near proximity are also essential, but they are not well studied. In this study, systematic measurements of the aerodynamic drag, body posture, and relative positioning of speed skaters have been performed in the low-speed wind tunnel at the Norwegian University of Science and Technology, in order to investigate the aerodynamic interaction between two speed skaters. Drag measurements of static speed skaters drafting, leading, and side-by-side, as well as dynamic drag measurements during synchronized and unsynchronized movement at different distances, were performed. The projected frontal area was measured for all postures and movements, and a blockage correction was applied, as the blockage ratio ranged from 5-15% in the different setups. The static drag measurements were performed on two test subjects in two different postures, a low posture and a high posture, and at two different distances between the test subjects, 1.5T and 3T, where T is the length of the torso (T = 0.63 m). A drag reduction, ranging from 11.4% to 39%, was observed for the drafting test subject at all distances and configurations. The drag of the leading test subject was only influenced at -1.5T, with a maximum drag reduction of 5.6%. An increase in drag was seen for all side-by-side measurements; the biggest increase, 25.7%, was observed at the closest distance between the test subjects, and the lowest, 2.7%, with ∼0.7 m between the test subjects. A clear aerodynamic interaction between the test subjects and their postures was observed in most static measurements, with results corresponding well to recent studies.
For the dynamic measurements, the leading test subject had a drag reduction of 3% even at -3T. The drafting test subject showed a drag reduction of 15% when in a synchronized (sync) motion with the leading test subject at 4.5T. The maximal drag reductions for the leading and the drafting test subjects were observed when skating as close as possible in sync, at 8.5% and 25.7% respectively. This study emphasizes the importance of keeping a synchronized movement by showing that the maximal gain for the leading and drafting test subjects dropped to 3.2% and 3.3% respectively when the skaters are in opposite phase. Individual differences in technique also appear to influence the drag of the other test subject.Keywords: aerodynamic interaction, drag force, frontal area, speed skating
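The percentages reported above are percent changes of measured drag force relative to a solo baseline, and the force itself converts to a drag area (C_d·A) through the standard dynamic-pressure relation. A toy computation with invented force readings (the blockage correction is deliberately omitted here, since several competing correction schemes exist):

```python
RHO = 1.20   # air density, kg/m^3 (assumed)
V = 12.0     # tunnel speed, m/s (assumed)

def drag_area(force_N):
    """Drag area C_d*A from a measured drag force: F = 0.5*rho*V^2*(C_d A)."""
    return force_N / (0.5 * RHO * V**2)

def percent_change(measured, baseline):
    """Percent drag change vs. solo baseline; negative = reduction."""
    return (measured - baseline) / baseline * 100.0

# Invented wind-tunnel force readings, N.
solo = 28.0
drafting_at_1_5T = 17.1
side_by_side = 35.2

draft_gain = percent_change(drafting_at_1_5T, solo)
side_penalty = percent_change(side_by_side, solo)
cda_solo = drag_area(solo)
```

With these toy numbers the drafting skater saves roughly 39% and the side-by-side pair pays roughly a 26% penalty, mirroring the magnitudes the abstract reports.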
Procedia PDF Downloads 13114 Recommendations for Data Quality Filtering of Opportunistic Species Occurrence Data
Authors: Camille Van Eupen, Dirk Maes, Marc Herremans, Kristijn R. R. Swinnen, Ben Somers, Stijn Luca
Abstract:
In ecology, species distribution models are commonly implemented to study species-environment relationships. These models increasingly rely on opportunistic citizen science data when high-quality species records collected through standardized recording protocols are unavailable. While these opportunistic data are abundant, uncertainty is usually high, e.g., due to observer effects or a lack of metadata. Data quality filtering is often used to reduce these types of uncertainty in an attempt to increase the value of studies relying on opportunistic data. However, filtering should not be performed blindly. In this study, recommendations are developed for the data quality filtering of opportunistic species occurrence data used as input for species distribution models. Using an extensive database of 5.7 million citizen science records from 255 species in Flanders, the impact on model performance was quantified by applying three data quality filters, and these results were linked to species traits. More specifically, presence records were filtered based on record attributes that provide information on the observation process or post-entry data validation, and changes in the area under the receiver operating characteristic curve (AUC), sensitivity, and specificity were analyzed using the Maxent algorithm with and without filtering. Controlling for sample size enabled us to study the combined impact of data quality filtering, i.e., the simultaneous impact of an increase in data quality and a decrease in sample size. Further, the variation among species in their response to data quality filtering was explored by clustering species based on four traits often related to data quality: commonness, popularity, difficulty, and body size.
Findings show that model performance is affected by i) the quality of the filtered data, ii) the proportional reduction in sample size caused by filtering and the remaining absolute sample size, and iii) a species ‘quality profile’, resulting from a species classification based on the four traits related to data quality. The findings resulted in recommendations on when and how to filter volunteer-generated and opportunistically collected data. This study confirms that correctly processed citizen science data can make a valuable contribution to ecological research and species conservation.Keywords: citizen science, data quality filtering, species distribution models, trait profiles
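The filtering-impact comparison boils down to computing sensitivity and specificity on predictions built from unfiltered versus quality-filtered records. A minimal sketch (the tiny record set and its "passed quality filter" flags are invented; the study used Maxent on millions of records, which is not reproduced here):

```python
def sens_spec(pairs):
    """Sensitivity and specificity from (predicted, true) boolean pairs."""
    tp = sum(p and t for p, t in pairs)
    tn = sum((not p) and (not t) for p, t in pairs)
    fp = sum(p and (not t) for p, t in pairs)
    fn = sum((not p) and t for p, t in pairs)
    return tp / (tp + fn), tn / (tn + fp)

# (predicted presence, true presence, passed_quality_filter) per record;
# the noisy records are the ones the hypothetical filter rejects.
records = [
    (True, True, True), (True, True, True), (False, True, False),
    (True, False, False), (False, False, True), (False, False, True),
    (True, True, True), (False, True, False), (False, False, True),
    (True, False, False),
]

all_pairs = [(p, t) for p, t, _ in records]
filtered_pairs = [(p, t) for p, t, keep in records if keep]

sens_all, spec_all = sens_spec(all_pairs)
sens_f, spec_f = sens_spec(filtered_pairs)
```

In this toy setup filtering raises both metrics, but it also shrinks the sample from ten records to six, which is exactly the quality-versus-sample-size trade-off the study controls for.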
Procedia PDF Downloads 203